Problem Statement

Pneumonia is an infection in one or both lungs, caused by bacteria, viruses, or fungi. The infection inflames the air sacs of the lungs, called alveoli. Pneumonia accounts for over 15% of all deaths of children under 5 years old worldwide; in 2017, 920,000 children under the age of 5 died from the disease. Diagnosis requires review of a chest radiograph (CXR) by highly trained specialists and confirmation through clinical history, vital signs, and laboratory exams.

Pneumonia usually manifests as an area or areas of increased opacity on a CXR. However, diagnosing pneumonia on a CXR is complicated by a number of other lung conditions such as fluid overload (pulmonary edema), bleeding, volume loss (atelectasis or collapse), lung cancer, and post-radiation or surgical changes. Outside the lungs, fluid in the pleural space (pleural effusion) also appears as increased opacity on a CXR. When available, comparing a patient's CXRs taken at different time points and correlating them with clinical symptoms and history helps in making the diagnosis.

CXRs are the most commonly performed diagnostic imaging study. Factors such as patient positioning and depth of inspiration can alter the appearance of a CXR, complicating interpretation further. In addition, clinicians must read high volumes of images every shift.

Pneumonia Detection

To detect pneumonia, we need to detect inflammation of the lungs. In this project, you are challenged to build an algorithm that detects a visual signal for pneumonia in medical images. Specifically, your algorithm needs to automatically locate lung opacities on chest radiographs.

Objectives

The objectives of the project are:

  • Learn how to build an object detection model
  • Use transfer learning to fine-tune a model
  • Read research papers in the given domain to gain knowledge of advanced models for the problem

Import the Libraries

In [1]:
pip install segmentation_models
Requirement already satisfied: segmentation_models in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (1.0.1)
Requirement already satisfied: efficientnet==1.0.0 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from segmentation_models) (1.0.0)
Requirement already satisfied: keras-applications<=1.0.8,>=1.0.7 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from segmentation_models) (1.0.8)
Requirement already satisfied: image-classifiers==1.0.0 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from segmentation_models) (1.0.0)
Requirement already satisfied: scikit-image in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from efficientnet==1.0.0->segmentation_models) (0.17.2)
Requirement already satisfied: h5py in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from keras-applications<=1.0.8,>=1.0.7->segmentation_models) (2.10.0)
Requirement already satisfied: numpy>=1.9.1 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from keras-applications<=1.0.8,>=1.0.7->segmentation_models) (1.19.1)
Requirement already satisfied: networkx>=2.0 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (2.5)
Requirement already satisfied: matplotlib!=3.0.0,>=2.0.0 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (3.3.1)
Requirement already satisfied: PyWavelets>=1.1.1 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (1.1.1)
Requirement already satisfied: scipy>=1.0.1 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (1.5.2)
Requirement already satisfied: imageio>=2.3.0 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (2.9.0)
Requirement already satisfied: tifffile>=2019.7.26 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (2020.9.3)
Requirement already satisfied: pillow!=7.1.0,!=7.1.1,>=4.3.0 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from scikit-image->efficientnet==1.0.0->segmentation_models) (7.2.0)
Requirement already satisfied: six in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from h5py->keras-applications<=1.0.8,>=1.0.7->segmentation_models) (1.15.0)
Requirement already satisfied: decorator>=4.3.0 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from networkx>=2.0->scikit-image->efficientnet==1.0.0->segmentation_models) (4.4.2)
Requirement already satisfied: python-dateutil>=2.1 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0->segmentation_models) (2.8.1)
Requirement already satisfied: cycler>=0.10 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0->segmentation_models) (0.10.0)
Requirement already satisfied: certifi>=2020.06.20 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0->segmentation_models) (2020.6.20)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.3 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0->segmentation_models) (2.4.7)
Requirement already satisfied: kiwisolver>=1.0.1 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0->segmentation_models) (1.2.0)
Note: you may need to restart the kernel to use updated packages.
In [2]:
pip install pydot
Requirement already satisfied: pydot in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (1.4.1)
Requirement already satisfied: pyparsing>=2.1.4 in /opt/anaconda3/envs/tensorflow/lib/python3.6/site-packages (from pydot) (2.4.7)
Note: you may need to restart the kernel to use updated packages.
In [3]:
import os
import math
import random
import warnings
from glob import glob

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle
import pylab as pl
import seaborn as sns
import cv2
import pydicom
from tqdm import tqdm, tqdm_notebook

import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.models import Model, Sequential
from tensorflow.keras.layers import (Input, Dense, Flatten, Dropout, Activation,
                                     BatchNormalization, SpatialDropout2D, Reshape,
                                     UpSampling2D, Concatenate, Conv2D, SeparableConv2D,
                                     MaxPool2D, LeakyReLU, GlobalAveragePooling2D)
from tensorflow.keras.optimizers import Adam, SGD, RMSprop
# to define loss
from tensorflow.keras.losses import binary_crossentropy
from tensorflow.keras.backend import log, epsilon
from tensorflow.keras.callbacks import (ModelCheckpoint, EarlyStopping,
                                        ReduceLROnPlateau, CSVLogger)

# Pre-trained backbones
from tensorflow.keras.applications import (VGG16, VGG19, ResNet50, InceptionV3,
                                           DenseNet121, DenseNet169)
from tensorflow.keras.applications.mobilenet import MobileNet
from tensorflow.keras.applications.mobilenet import preprocess_input
from tensorflow.keras.applications.densenet import preprocess_input  # shadows the MobileNet preprocess_input above
#from skimage.transform import resize

import segmentation_models
from segmentation_models.losses import bce_jaccard_loss
from segmentation_models.metrics import iou_score

import keras
from keras.utils.vis_utils import plot_model

from sklearn.model_selection import train_test_split
from sklearn.utils import resample
from sklearn.metrics import roc_auc_score, roc_curve, classification_report, confusion_matrix

random_state = 2020


# Ignore the warnings
warnings.filterwarnings("ignore")
Using TensorFlow backend.
Segmentation Models: using `keras` framework.

Project setup and file locations below

Change to the current project directory

In [4]:
os.chdir('/Volumes/Ayon_Drive/GreatLearning/Capstone_Pneumonia/')

Names of the training/test image folders and the bounding box CSV files below

In [5]:
DET_CLASS_INFO = 'stage_2_detailed_class_info.csv'
TRAIN_BBOX = 'stage_2_train_labels.csv'
TRAIN_IMG_DCM = "stage_2_train_images"
TEST_IMG_DCM = "stage_2_test_images"
TRAIN_IMG_DIR_JPG = 'JPG_train'
TEST_IMG_DIR_JPG = 'JPG_test'
nbrImages = 0.02

Pre-Processing, Data Visualization, EDA

Exploratory Data Analysis (EDA)

As part of the EDA, we will:

  • Start with an understanding of the data, with a brief look at the train/test labels and the respective class info
  • Look at the first five rows of both CSVs (train and test)
  • Identify how the classes and the target are distributed
  • Check the number of patients with 1, 2, ... bounding boxes
  • Read and extract metadata from the DICOM files
  • Perform analysis on some of the features from the DICOM files
  • Check some random images from the training dataset
  • Draw insights from the data at various stages of the EDA

Reading CSVs

Images for the current stage are in stage_2_train_images and stage_2_test_images. The training data consists of:

  • stage_2_train_labels.csv: patientIds and bounding boxes
  • stage_2_detailed_class_info.csv: detailed information about the positive and negative classes in the training set

Loading detailed class info file

In [6]:
class_df = pd.read_csv(DET_CLASS_INFO)
In [7]:
print("\nClass dataframe has 30227 rows and 2 columns:")
class_df.shape
Class dataframe has 30227 rows and 2 columns:
Out[7]:
(30227, 2)
In [8]:
print("\nClass dataframe first 5 rows:")
class_df.head()
Class dataframe first 5 rows:
Out[8]:
patientId class
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 No Lung Opacity / Not Normal
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd No Lung Opacity / Not Normal
2 00322d4d-1c29-4943-afc9-b6754be640eb No Lung Opacity / Not Normal
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 Normal
4 00436515-870c-4b36-a041-de91049b9ab4 Lung Opacity
In [9]:
print('Total No of Patients in Class Info', class_df['patientId'].value_counts().shape[0])
Total No of Patients in Class Info 26684
In [10]:
print('Total distinct classes: ', class_df['class'].unique())
Total distinct classes:  ['No Lung Opacity / Not Normal' 'Normal' 'Lung Opacity']

We see there are 3 classes: Normal, Lung Opacity, and No Lung Opacity / Not Normal.

No Lung Opacity / Not Normal covers cases that look like opacity but are not pneumonia.

Check for duplicate patient ids

In [11]:
##Identify duplicates records in the data
dupes = class_df['patientId'].duplicated()
sum(dupes)
Out[11]:
3543

3,543 rows in the class info file have a duplicated patientId
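As a quick sanity check on how `duplicated()` counts: it flags every occurrence of a value after the first, so the number of unique patientIds should equal the total rows minus this count. A minimal sketch on toy data (the ids here are hypothetical):

```python
import pandas as pd

# duplicated() marks each repeat after the first occurrence
ids = pd.Series(["p1", "p1", "p1", "p2", "p3"])
n_dupes = ids.duplicated().sum()   # the two extra "p1" rows
n_unique = ids.nunique()

print(n_dupes, n_unique)  # 2 3
assert len(ids) - n_dupes == n_unique
```

For our data this gives 30,227 - 3,543 = 26,684 unique patients, matching the patient count printed earlier.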

In [12]:
class_df.groupby('class').size().plot.bar(5, 10, color=['Orange', 'green', 'Indigo'])
Out[12]:
<AxesSubplot:xlabel='class'>

Load CSV file containing training set patientIds and labels (Bounding Boxes)

In [13]:
labels_df = pd.read_csv(TRAIN_BBOX)
In [14]:
print("\nLabel dataframe first 5 rows:")
labels_df.head()
Label dataframe first 5 rows:
Out[14]:
patientId x y width height Target
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 NaN NaN NaN NaN 0
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd NaN NaN NaN NaN 0
2 00322d4d-1c29-4943-afc9-b6754be640eb NaN NaN NaN NaN 0
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 NaN NaN NaN NaN 0
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1

We see patient ids and bounding boxes in the dataset. Target = 0 means no pneumonia, Target = 1 means pneumonia.

A bounding box is not present when the patient does not have pneumonia; note, however, that Target = 0 can mean either Normal or No Lung Opacity / Not Normal.

In [15]:
print(f'Train Labels dataframe has {labels_df.shape[0]} rows and {labels_df.shape[1]} columns')
Train Labels dataframe has 30227 rows and 6 columns

Converting NaN (Not a Number) values to 0

In [16]:
labels_df['x'] = labels_df['x'].replace(np.nan, 0)
labels_df['y'] = labels_df['y'].replace(np.nan, 0)
labels_df['width'] = labels_df['width'].replace(np.nan, 0)
labels_df['height'] = labels_df['height'].replace(np.nan, 0)
In [17]:
print("\nUpdated data samples:")
labels_df.head()
Updated data samples:
Out[17]:
patientId x y width height Target
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 0.0 0.0 0.0 0.0 0
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd 0.0 0.0 0.0 0.0 0
2 00322d4d-1c29-4943-afc9-b6754be640eb 0.0 0.0 0.0 0.0 0
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 0.0 0.0 0.0 0.0 0
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1

There are 30,227 rows in the labels dataframe
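The four `replace(np.nan, 0)` calls above can be expressed as a single `fillna` on the bounding-box columns. A sketch on hypothetical toy rows, using the same column names as the labels CSV:

```python
import numpy as np
import pandas as pd

# Toy frame mimicking stage_2_train_labels.csv
df = pd.DataFrame({
    "patientId": ["a", "b"],
    "x": [np.nan, 264.0], "y": [np.nan, 152.0],
    "width": [np.nan, 213.0], "height": [np.nan, 379.0],
    "Target": [0, 1],
})

# Fill only the bounding-box columns, leaving the rest untouched
cols = ["x", "y", "width", "height"]
df[cols] = df[cols].fillna(0)

print(df.loc[0, cols].tolist())  # [0.0, 0.0, 0.0, 0.0]
```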

Checking the dataset information

In [18]:
labels_df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 30227 entries, 0 to 30226
Data columns (total 6 columns):
 #   Column     Non-Null Count  Dtype  
---  ------     --------------  -----  
 0   patientId  30227 non-null  object 
 1   x          30227 non-null  float64
 2   y          30227 non-null  float64
 3   width      30227 non-null  float64
 4   height     30227 non-null  float64
 5   Target     30227 non-null  int64  
dtypes: float64(4), int64(1), object(1)
memory usage: 1.4+ MB
In [19]:
labels_df.describe()
Out[19]:
x y width height Target
count 30227.000000 30227.000000 30227.000000 30227.000000 30227.000000
mean 124.561683 115.960962 69.060575 104.084825 0.316108
std 216.326397 190.012883 106.910496 176.932152 0.464963
min 0.000000 0.000000 0.000000 0.000000 0.000000
25% 0.000000 0.000000 0.000000 0.000000 0.000000
50% 0.000000 0.000000 0.000000 0.000000 0.000000
75% 193.000000 231.000000 169.000000 188.000000 1.000000
max 835.000000 881.000000 528.000000 942.000000 1.000000

Checking whether the duplicate count matches the class info file

In [20]:
##Identify duplicates records in the data
dupes = labels_df['patientId'].duplicated()
sum(dupes)
Out[20]:
3543

3,543 rows have a duplicated patientId, the same count as in the class info file; these are patients with more than one bounding box

Check for missing values

In [21]:
print(" \nCount total NaN at each column in the dataset : \n\n", 
      labels_df.isnull().sum())
 
Count total NaN at each column in the dataset : 

 patientId    0
x            0
y            0
width        0
height       0
Target       0
dtype: int64

From the above we see there are no null values (as expected, since we replaced the NaNs with 0 earlier)

In [22]:
print('Lets check the distribution of `Target` and `class` column'); print('--'*40)
fig = plt.figure(figsize = (10, 6))
ax = fig.add_subplot(121)
g = (labels_df['Target'].value_counts()
    .plot(kind = 'pie', autopct = '%.0f%%', 
          labels = ['Negative', 'Pneumonia Evidence'], 
          colors = ['green', 'red'], 
          startangle = 90, 
          title = 'Distribution of Target', fontsize = 12)
    .set_ylabel(''))
ax = fig.add_subplot(122)
g = (class_df['class'].value_counts().sort_index(ascending = False)
    .plot(kind = 'pie', autopct = '%.0f%%', 
          colors = ['green', 'orange', 'red'], 
          startangle = 90, title = 'Distribution of Class', 
          fontsize = 12)
    .set_ylabel(''))
plt.tight_layout()
Lets check the distribution of `Target` and `class` column
--------------------------------------------------------------------------------

We will check the number of boxes for each patient

In [23]:
box_patient_df = labels_df.groupby('patientId').size().reset_index(name='boxes')
box_patient_df.groupby('boxes').size().reset_index(name='patients')
Out[23]:
boxes patients
0 1 23286
1 2 3266
2 3 119
3 4 13
In [24]:
labels_class_df = pd.merge(labels_df, class_df, how='inner', on='patientId')
print('Total Cases : ', labels_class_df.shape[0])
Total Cases :  37629

Since there are duplicate patientIds in both datasets, the inner join multiplies the matching rows and the row count increases.

Instead of an inner join, let's concatenate the class column onto the labels dataframe.
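Why the inner join grows the row count: when a key appears m times in the left frame and n times in the right, `merge` emits m × n rows for that key. A minimal sketch with hypothetical frames:

```python
import pandas as pd

left = pd.DataFrame({"patientId": ["p1", "p1"], "box": [1, 2]})
right = pd.DataFrame({"patientId": ["p1", "p1"], "cls": ["Lung Opacity"] * 2})

# Each of the 2 left rows matches each of the 2 right rows
merged = left.merge(right, on="patientId", how="inner")
print(len(merged))  # 4
```

Since both CSVs repeat a patientId once per bounding box, a patient with k boxes contributes k² rows to the join: 23,286·1 + 3,266·4 + 119·9 + 13·16 = 37,629, exactly the count printed above.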

In [25]:
print('Let\'s also check whether each patientId has only one type of class'); print('--'*40)
print('Yes, each patientId is associated with only {} class'.format(class_df.groupby(['patientId'])['class'].nunique().max()))

# Merge the two dataframes
train_class_df = pd.concat([labels_df, class_df['class']], axis = 1)
print('Shape of the dataset after the merge: {}'.format(train_class_df.shape))
Let's also check whether each patientId has only one type of class
--------------------------------------------------------------------------------
Yes, each patientId is associated with only 1 class
Shape of the dataset after the merge: (30227, 7)
In [26]:
train_class_df.head(5)
Out[26]:
patientId x y width height Target class
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal
2 00322d4d-1c29-4943-afc9-b6754be640eb 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 0.0 0.0 0.0 0.0 0 Normal
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1 Lung Opacity

Observations from the CSVs

Based on the analysis above:

  • The training data is a set of patientIds and bounding boxes; a bounding box is defined by x, y, width, and height.
  • There are multiple records for some patients: 3,543 rows have a duplicated patientId.
  • There is a binary Target column indicating whether there was evidence of pneumonia or no definitive evidence of pneumonia.
  • The class label is one of: No Lung Opacity / Not Normal, Normal, and Lung Opacity.
  • Chest examinations with Target = 1, i.e. those with evidence of pneumonia, are associated with the Lung Opacity class.
  • Chest examinations with Target = 0, i.e. those with no definitive evidence of pneumonia, belong to either the Normal or the No Lung Opacity / Not Normal class.
  • About 23,286 patientIds (~87%) have 1 bounding box, while 13 patients have 4 bounding boxes.
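The boxes are given in (x, y, width, height) form, i.e. the top-left corner plus extent. Many detection tools and IoU computations instead expect corner coordinates (x1, y1, x2, y2); converting is a one-liner. A sketch, using the example box (264, 152, 213, 379) seen in the table earlier:

```python
def xywh_to_corners(x, y, w, h):
    """Convert a (top-left x, top-left y, width, height) box
    to (x1, y1, x2, y2) corner coordinates."""
    return x, y, x + w, y + h

# Box for the Lung Opacity example row above
print(xywh_to_corners(264.0, 152.0, 213.0, 379.0))  # (264.0, 152.0, 477.0, 531.0)
```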

Reading Images

The images provided are stored in DICOM (.dcm) format, an international standard to transmit, store, retrieve, print, process, and display medical imaging information. Digital Imaging and Communications in Medicine (DICOM) makes medical imaging information interoperable. We will use the pydicom package to read the images.

In [27]:
def checkXray(i, dirName):
    patientId = train_class_df['patientId'][i]
    print("Patient Id: ", patientId)
    fileName = dirName + "/" + patientId
    print("\nBounding Box Coordinates, X: ", train_class_df['x'][i])
    print("\nBounding Box Coordinates, Y: ", train_class_df['y'][i])
    print("\nBounding Box Coordinates, Width: ", train_class_df['width'][i])
    print("\nBounding Box Coordinates, Height: ", train_class_df['height'][i])
    
    patient_file = '%s.dcm' % fileName
    patient_data = pydicom.read_file(patient_file)
    print(patient_data)
    
    plt.imshow(patient_data.pixel_array,cmap=pl.cm.gist_gray)

Let's take a look at an image of a patient with normal lungs

In [28]:
checkXray(3, TRAIN_IMG_DCM)
Patient Id:  003d8fa0-6bf1-40ed-b54c-ac657f8495c5

Bounding Box Coordinates, X:  0.0

Bounding Box Coordinates, Y:  0.0

Bounding Box Coordinates, Width:  0.0

Bounding Box Coordinates, Height:  0.0
Dataset.file_meta -------------------------------
(0002, 0000) File Meta Information Group Length  UL: 200
(0002, 0001) File Meta Information Version       OB: b'\x00\x01'
(0002, 0002) Media Storage SOP Class UID         UI: Secondary Capture Image Storage
(0002, 0003) Media Storage SOP Instance UID      UI: 1.2.276.0.7230010.3.1.4.8323329.2293.1517874295.733882
(0002, 0010) Transfer Syntax UID                 UI: JPEG Baseline (Process 1)
(0002, 0012) Implementation Class UID            UI: 1.2.276.0.7230010.3.0.3.6.0
(0002, 0013) Implementation Version Name         SH: 'OFFIS_DCMTK_360'
-------------------------------------------------
(0008, 0005) Specific Character Set              CS: 'ISO_IR 100'
(0008, 0016) SOP Class UID                       UI: Secondary Capture Image Storage
(0008, 0018) SOP Instance UID                    UI: 1.2.276.0.7230010.3.1.4.8323329.2293.1517874295.733882
(0008, 0020) Study Date                          DA: '19010101'
(0008, 0030) Study Time                          TM: '000000.00'
(0008, 0050) Accession Number                    SH: ''
(0008, 0060) Modality                            CS: 'CR'
(0008, 0064) Conversion Type                     CS: 'WSD'
(0008, 0090) Referring Physician's Name          PN: ''
(0008, 103e) Series Description                  LO: 'view: PA'
(0010, 0010) Patient's Name                      PN: '003d8fa0-6bf1-40ed-b54c-ac657f8495c5'
(0010, 0020) Patient ID                          LO: '003d8fa0-6bf1-40ed-b54c-ac657f8495c5'
(0010, 0030) Patient's Birth Date                DA: ''
(0010, 0040) Patient's Sex                       CS: 'M'
(0010, 1010) Patient's Age                       AS: '28'
(0018, 0015) Body Part Examined                  CS: 'CHEST'
(0018, 5101) View Position                       CS: 'PA'
(0020, 000d) Study Instance UID                  UI: 1.2.276.0.7230010.3.1.2.8323329.2293.1517874295.733881
(0020, 000e) Series Instance UID                 UI: 1.2.276.0.7230010.3.1.3.8323329.2293.1517874295.733880
(0020, 0010) Study ID                            SH: ''
(0020, 0011) Series Number                       IS: "1"
(0020, 0013) Instance Number                     IS: "1"
(0020, 0020) Patient Orientation                 CS: ''
(0028, 0002) Samples per Pixel                   US: 1
(0028, 0004) Photometric Interpretation          CS: 'MONOCHROME2'
(0028, 0010) Rows                                US: 1024
(0028, 0011) Columns                             US: 1024
(0028, 0030) Pixel Spacing                       DS: [0.14300000000000002, 0.14300000000000002]
(0028, 0100) Bits Allocated                      US: 8
(0028, 0101) Bits Stored                         US: 8
(0028, 0102) High Bit                            US: 7
(0028, 0103) Pixel Representation                US: 0
(0028, 2110) Lossy Image Compression             CS: '01'
(0028, 2114) Lossy Image Compression Method      CS: 'ISO_10918_1'
(7fe0, 0010) Pixel Data                          OB: Array of 155284 elements

Let's take a look at an image of a patient with lung opacity

In [29]:
checkXray(4, TRAIN_IMG_DCM)
Patient Id:  00436515-870c-4b36-a041-de91049b9ab4

Bounding Box Coordinates, X:  264.0

Bounding Box Coordinates, Y:  152.0

Bounding Box Coordinates, Width:  213.0

Bounding Box Coordinates, Height:  379.0
Dataset.file_meta -------------------------------
(0002, 0000) File Meta Information Group Length  UL: 200
(0002, 0001) File Meta Information Version       OB: b'\x00\x01'
(0002, 0002) Media Storage SOP Class UID         UI: Secondary Capture Image Storage
(0002, 0003) Media Storage SOP Instance UID      UI: 1.2.276.0.7230010.3.1.4.8323329.6379.1517874325.469569
(0002, 0010) Transfer Syntax UID                 UI: JPEG Baseline (Process 1)
(0002, 0012) Implementation Class UID            UI: 1.2.276.0.7230010.3.0.3.6.0
(0002, 0013) Implementation Version Name         SH: 'OFFIS_DCMTK_360'
-------------------------------------------------
(0008, 0005) Specific Character Set              CS: 'ISO_IR 100'
(0008, 0016) SOP Class UID                       UI: Secondary Capture Image Storage
(0008, 0018) SOP Instance UID                    UI: 1.2.276.0.7230010.3.1.4.8323329.6379.1517874325.469569
(0008, 0020) Study Date                          DA: '19010101'
(0008, 0030) Study Time                          TM: '000000.00'
(0008, 0050) Accession Number                    SH: ''
(0008, 0060) Modality                            CS: 'CR'
(0008, 0064) Conversion Type                     CS: 'WSD'
(0008, 0090) Referring Physician's Name          PN: ''
(0008, 103e) Series Description                  LO: 'view: AP'
(0010, 0010) Patient's Name                      PN: '00436515-870c-4b36-a041-de91049b9ab4'
(0010, 0020) Patient ID                          LO: '00436515-870c-4b36-a041-de91049b9ab4'
(0010, 0030) Patient's Birth Date                DA: ''
(0010, 0040) Patient's Sex                       CS: 'F'
(0010, 1010) Patient's Age                       AS: '32'
(0018, 0015) Body Part Examined                  CS: 'CHEST'
(0018, 5101) View Position                       CS: 'AP'
(0020, 000d) Study Instance UID                  UI: 1.2.276.0.7230010.3.1.2.8323329.6379.1517874325.469568
(0020, 000e) Series Instance UID                 UI: 1.2.276.0.7230010.3.1.3.8323329.6379.1517874325.469567
(0020, 0010) Study ID                            SH: ''
(0020, 0011) Series Number                       IS: "1"
(0020, 0013) Instance Number                     IS: "1"
(0020, 0020) Patient Orientation                 CS: ''
(0028, 0002) Samples per Pixel                   US: 1
(0028, 0004) Photometric Interpretation          CS: 'MONOCHROME2'
(0028, 0010) Rows                                US: 1024
(0028, 0011) Columns                             US: 1024
(0028, 0030) Pixel Spacing                       DS: [0.139, 0.139]
(0028, 0100) Bits Allocated                      US: 8
(0028, 0101) Bits Stored                         US: 8
(0028, 0102) High Bit                            US: 7
(0028, 0103) Pixel Representation                US: 0
(0028, 2110) Lossy Image Compression             CS: '01'
(0028, 2114) Lossy Image Compression Method      CS: 'ISO_10918_1'
(7fe0, 0010) Pixel Data                          OB: Array of 119382 elements

Let's take a look at an image of a patient classified as No Lung Opacity / Not Normal

In [30]:
checkXray(0, TRAIN_IMG_DCM)
Patient Id:  0004cfab-14fd-4e49-80ba-63a80b6bddd6

Bounding Box Coordinates, X:  0.0

Bounding Box Coordinates, Y:  0.0

Bounding Box Coordinates, Width:  0.0

Bounding Box Coordinates, Height:  0.0
Dataset.file_meta -------------------------------
(0002, 0000) File Meta Information Group Length  UL: 202
(0002, 0001) File Meta Information Version       OB: b'\x00\x01'
(0002, 0002) Media Storage SOP Class UID         UI: Secondary Capture Image Storage
(0002, 0003) Media Storage SOP Instance UID      UI: 1.2.276.0.7230010.3.1.4.8323329.28530.1517874485.775526
(0002, 0010) Transfer Syntax UID                 UI: JPEG Baseline (Process 1)
(0002, 0012) Implementation Class UID            UI: 1.2.276.0.7230010.3.0.3.6.0
(0002, 0013) Implementation Version Name         SH: 'OFFIS_DCMTK_360'
-------------------------------------------------
(0008, 0005) Specific Character Set              CS: 'ISO_IR 100'
(0008, 0016) SOP Class UID                       UI: Secondary Capture Image Storage
(0008, 0018) SOP Instance UID                    UI: 1.2.276.0.7230010.3.1.4.8323329.28530.1517874485.775526
(0008, 0020) Study Date                          DA: '19010101'
(0008, 0030) Study Time                          TM: '000000.00'
(0008, 0050) Accession Number                    SH: ''
(0008, 0060) Modality                            CS: 'CR'
(0008, 0064) Conversion Type                     CS: 'WSD'
(0008, 0090) Referring Physician's Name          PN: ''
(0008, 103e) Series Description                  LO: 'view: PA'
(0010, 0010) Patient's Name                      PN: '0004cfab-14fd-4e49-80ba-63a80b6bddd6'
(0010, 0020) Patient ID                          LO: '0004cfab-14fd-4e49-80ba-63a80b6bddd6'
(0010, 0030) Patient's Birth Date                DA: ''
(0010, 0040) Patient's Sex                       CS: 'F'
(0010, 1010) Patient's Age                       AS: '51'
(0018, 0015) Body Part Examined                  CS: 'CHEST'
(0018, 5101) View Position                       CS: 'PA'
(0020, 000d) Study Instance UID                  UI: 1.2.276.0.7230010.3.1.2.8323329.28530.1517874485.775525
(0020, 000e) Series Instance UID                 UI: 1.2.276.0.7230010.3.1.3.8323329.28530.1517874485.775524
(0020, 0010) Study ID                            SH: ''
(0020, 0011) Series Number                       IS: "1"
(0020, 0013) Instance Number                     IS: "1"
(0020, 0020) Patient Orientation                 CS: ''
(0028, 0002) Samples per Pixel                   US: 1
(0028, 0004) Photometric Interpretation          CS: 'MONOCHROME2'
(0028, 0010) Rows                                US: 1024
(0028, 0011) Columns                             US: 1024
(0028, 0030) Pixel Spacing                       DS: [0.14300000000000002, 0.14300000000000002]
(0028, 0100) Bits Allocated                      US: 8
(0028, 0101) Bits Stored                         US: 8
(0028, 0102) High Bit                            US: 7
(0028, 0103) Pixel Representation                US: 0
(0028, 2110) Lossy Image Compression             CS: '01'
(0028, 2114) Lossy Image Compression Method      CS: 'ISO_10918_1'
(7fe0, 0010) Pixel Data                          OB: Array of 142006 elements
In [31]:
# Helper function to get additional features from dicom images
def get_tags(data, path):
    images = os.listdir(path)
    for _, name in tqdm_notebook(enumerate(images)):
        img_path = os.path.join(path, name)
        img_data = pydicom.read_file(img_path)
        idx = (data['patientId'] == img_data.PatientID)
        data.loc[idx,'PatientSex'] = img_data.PatientSex
        data.loc[idx,'PatientAge'] = pd.to_numeric(img_data.PatientAge)
        data.loc[idx,'BodyPartExamined'] = img_data.BodyPartExamined
        data.loc[idx,'ViewPosition'] = img_data.ViewPosition
        data.loc[idx,'Modality'] = img_data.Modality
In [32]:
print('Read the training images file names and path'); print('--'*40)
images = pd.DataFrame({'path': glob(os.path.join(TRAIN_IMG_DCM, '*.dcm'))})
images['patientId'] = images['path'].map(lambda x: os.path.splitext(os.path.basename(x))[0])
print('Number of images in the training folder: {}'.format(images.shape[0]))
print('Columns in the training images dataframe: {}'.format(list(images.columns)))
assert images.shape[0] == len(list(set(train_class_df['patientId']))), 'Number of training images should be equal to the unique patientIds we have'
Read the training images file names and path
--------------------------------------------------------------------------------
Number of images in the training folder: 26684
Columns in the training images dataframe: ['path', 'patientId']
In [33]:
print('Merge path from the `images` dataframe with `train_class` dataframe'); print('--'*40)
train_class_df = train_class_df.merge(images, on = 'patientId', how = 'left')
print('Shape of the `train_class` dataframe after merge: {}'.format(train_class_df.shape))
Merge path from the `images` dataframe with `train_class` dataframe
--------------------------------------------------------------------------------
Shape of the `train_class` dataframe after merge: (30227, 8)
In [34]:
train_class_df.head()
Out[34]:
patientId x y width height Target class path
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/0004cfab-14fd-4e49-80ba-6...
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00313ee0-9eaa-42f4-b0ab-c...
2 00322d4d-1c29-4943-afc9-b6754be640eb 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00322d4d-1c29-4943-afc9-b...
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/003d8fa0-6bf1-40ed-b54c-a...
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1 Lung Opacity stage_2_train_images/00436515-870c-4b36-a041-d...
In [35]:
print('Get features such as {} from training images'.format(('PatientSex', 'PatientAge', 'BodyPartExamined', 'ViewPosition', 'Modality')))
if os.path.isfile('train_feature_engineered.pkl') :
    print('File exists, so we already have the data')
else:
    get_tags(train_class_df, TRAIN_IMG_DCM)
    train_class_df.to_pickle('train_feature_engineered.pkl')

print('Saving the feature engineered dataframe for future use'); print('--'*40)
Get features such as ('PatientSex', 'PatientAge', 'BodyPartExamined', 'ViewPosition', 'Modality') from training images
File exists, so we already have the data
Saving the feature engineered dataframe for future use
--------------------------------------------------------------------------------
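`get_tags` comes from a custom helper module, so its implementation isn't shown here. A hedged sketch of what such a helper might look like (the function name `add_dicom_tags` and the injectable `read` callable are my own, for illustration; note that DICOM stores `PatientAge` as a string such as `'051Y'`, which would still need parsing into the numeric ages seen below):

```python
def add_dicom_tags(df, read=None):
    """Copy a few DICOM header tags into `df`, which must have a 'path' column.

    `read` maps a file path to an object exposing DICOM attributes; it defaults
    to a header-only pydicom read (pixel data is never decoded).
    """
    if read is None:
        import pydicom
        read = lambda p: pydicom.dcmread(p, stop_before_pixels=True)
    for tag in ['PatientSex', 'PatientAge', 'BodyPartExamined',
                'ViewPosition', 'Modality']:
        # Missing tags fall back to None instead of raising.
        df[tag] = df['path'].map(lambda p, t=tag: getattr(read(p), t, None))
    return df
```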
In [36]:
train_class_df = pd.read_pickle('train_feature_engineered.pkl')
train_class_df.shape
Out[36]:
(30227, 13)
In [37]:
train_class_df.head()
Out[37]:
patientId x y width height Target class path PatientSex PatientAge BodyPartExamined ViewPosition Modality
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/0004cfab-14fd-4e49-80ba-6... F 51.0 CHEST PA CR
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00313ee0-9eaa-42f4-b0ab-c... F 48.0 CHEST PA CR
2 00322d4d-1c29-4943-afc9-b6754be640eb 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00322d4d-1c29-4943-afc9-b... M 19.0 CHEST AP CR
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/003d8fa0-6bf1-40ed-b54c-a... M 28.0 CHEST PA CR
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1 Lung Opacity stage_2_train_images/00436515-870c-4b36-a041-d... F 32.0 CHEST AP CR
In [38]:
print('As expected unique in `BodyPartExamined` is: {}'.format(train_class_df['BodyPartExamined'].unique()[0]))
print('Unique in `Modality` is: {}'.format(train_class_df['Modality'].unique()[0])); print('--'*40)
As expected unique in `BodyPartExamined` is: CHEST
Unique in `Modality` is: CR
--------------------------------------------------------------------------------
In [39]:
print('Overall the distribution of `ViewPosition` is almost equal, but where there is pneumonia evidence, `ViewPosition` is mostly `AP`')
print('AP: Anterior/Posterior, PA: Posterior/Anterior'); print('--'*40)
fig = plt.figure(figsize = (10, 6))
ax = fig.add_subplot(121)
g = (train_class_df['ViewPosition'].value_counts()
    .plot(kind = 'pie', autopct = '%.0f%%',  
          startangle = 90,
          title = 'Distribution of ViewPosition, Overall', 
          fontsize = 12)
    .set_ylabel(''))
ax = fig.add_subplot(122)
g = (train_class_df.loc[train_class_df['Target'] == 1, 'ViewPosition']
     .value_counts().sort_index(ascending = False)
    .plot(kind = 'pie', autopct = '%.0f%%', 
          startangle = 90, counterclock = False, 
          title = 'Distribution of ViewPosition, Pneumonia Evidence', 
          fontsize = 12)
    .set_ylabel(''))
Overall the distribution of `ViewPosition` is almost equal, but where there is pneumonia evidence, `ViewPosition` is mostly `AP`
AP: Anterior/Posterior, PA: Posterior/Anterior
--------------------------------------------------------------------------------
In [40]:
print('Plot x and y centers of bounding boxes'); print('--'*40)
# Creating a dataframe with columns for the centers of the bounding boxes
# (.copy() avoids pandas' SettingWithCopyWarning when adding the new columns)
bboxes = train_class_df[train_class_df['Target'] == 1].copy()
bboxes['xw'] = bboxes['x'] + bboxes['width'] / 2
bboxes['yh'] = bboxes['y'] + bboxes['height'] / 2

g = sns.jointplot(x = 'xw', y = 'yh', data = bboxes, 
                  kind = 'hex', alpha = 0.5, height = 8)  # `height` replaces the deprecated `size` argument
plt.suptitle('Bounding Boxes Location, Pneumonia Evidence')
plt.tight_layout()
plt.subplots_adjust(top = 0.95)
plt.show()
Plot x and y centers of bounding boxes
--------------------------------------------------------------------------------
In [41]:
# Helper function to plot bboxes scatter
# Reference for this function & plots: https://www.kaggle.com/gpreda/rsna-pneumonia-detection-eda
def bboxes_scatter(df1, df2, text1, text2):
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize = (13, 8))
    fig.subplots_adjust(top = 0.85)
    fig.suptitle('Plotting centers of lung opacity\n{} & {}'.format(text1, text2))
    df1.plot.scatter(x = 'xw', y = 'yh', ax = ax1, alpha = 0.8, marker = '.', 
                   xlim = (0, 1024), ylim = (0, 1024), color = 'green')
    ax1.set_title('Centers of Lung Opacity\n{}'.format(text1))
    for i, row in df1.iterrows():
        ax1.add_patch(Rectangle(xy = (row['x'], row['y']),
                            width = row['width'], height = row['height'], 
                            alpha = 3.5e-3, color = 'yellow'))
    plt.title('Centers of Lung Opacity\n{}'.format(text2))
    df2.plot.scatter(x = 'xw', y = 'yh', ax = ax2, alpha = 0.8, marker = '.',
                   color = 'brown',  xlim = (0, 1024), ylim = (0, 1024))
    ax2.set_title('Centers of Lung Opacity\n{}'.format(text2))
    for i, row in df2.iterrows():
        ax2.add_patch(Rectangle(xy = (row['x'], row['y']),
                             width = row['width'], height = row['height'],
                             alpha = 3.5e-3, 
                             color = 'yellow'))
    plt.show()
In [42]:
print('Exploring the bounding boxes centers for `ViewPositions` for random sample = 1000')

df1 = bboxes[bboxes['ViewPosition'] == 'PA'].sample(1000)
df2 = bboxes[bboxes['ViewPosition'] == 'AP'].sample(1000)
bboxes_scatter(df1, df2, 'View Position = PA', 'View Position = AP')
Exploring the bounding boxes centers for `ViewPositions` for random sample = 1000

Observations: BodyPartExamined & ViewPosition

As expected, BodyPartExamined is unique across all cases and is CHEST in the training dataset. Modality is uniformly CR, i.e. Computed Radiography. Overall, ViewPosition is almost equally distributed in the training dataset, but for cases where Target=1 most of the view positions are AP.

In [43]:
print('Checking outliers in `PatientAge`'); print('--'*40)
print('Minimum `PatientAge` in the training dataset: {}'.format(train_class_df['PatientAge'].min()))
print('Maximum `PatientAge` in the training dataset: {}'.format(train_class_df['PatientAge'].max()))
print('75th Percentile of `PatientAge` in the training dataset: {}'.format(train_class_df['PatientAge'].quantile(0.75)))
print('`PatientAge` in upper whisker for box plot: {}'.format(train_class_df['PatientAge'].quantile(0.75) + (train_class_df['PatientAge'].quantile(0.75) - train_class_df['PatientAge'].quantile(0.25))))
print()
fig = plt.figure(figsize = (10, 6))
ax = sns.boxplot(data = train_class_df['PatientAge'], orient = 'h').set_title('Outliers in PatientAge')
Checking outliers in `PatientAge`
--------------------------------------------------------------------------------
Minimum `PatientAge` in the training dataset: 1.0
Maximum `PatientAge` in the training dataset: 155.0
75th Percentile of `PatientAge` in the training dataset: 59.0
`PatientAge` in upper whisker for box plot: 84.0
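Note that the whisker printed above uses Q3 + 1×IQR, while the conventional Tukey whisker (what seaborn's boxplot actually draws) is Q3 + 1.5×IQR. A toy sketch of the convention, with made-up ages rather than the dataset:

```python
import pandas as pd

ages = pd.Series([19, 28, 32, 34, 48, 49, 51, 59, 155])  # toy ages, one outlier

q1, q3 = ages.quantile(0.25), ages.quantile(0.75)
iqr = q3 - q1
upper_whisker = q3 + 1.5 * iqr   # Tukey's rule, the default in box plots

outliers = ages[ages > upper_whisker]
print(outliers.tolist())  # [155]
```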

In [44]:
print('Using Series.clip to set an upper threshold of 100 for age and remove outliers'); print('--'*40)
train_class_df['PatientAge'] = train_class_df['PatientAge'].clip(train_class_df['PatientAge'].min(), 100)
train_class_df['PatientAge'].describe().astype(int)
Using Series.clip to set an upper threshold of 100 for age and remove outliers
--------------------------------------------------------------------------------
Out[44]:
count    30227
mean        46
std         16
min          1
25%         34
50%         49
75%         59
max        100
Name: PatientAge, dtype: int64
In [45]:
print('Get the distribution of `PatientAge` overall and where Target = 1'); print('--'*40)
fig = plt.figure(figsize = (10, 6))
ax = fig.add_subplot(121)
g = (sns.distplot(train_class_df['PatientAge'])
    .set_title('Distribution of PatientAge, Overall'))
ax = fig.add_subplot(122)
g = (sns.distplot(train_class_df.loc[train_class_df['Target'] == 1, 'PatientAge'])
    .set_title('Distribution of PatientAge, Pneumonia Evidence'))
Get the distribution of `PatientAge` overall and where Target = 1
--------------------------------------------------------------------------------

Using Binning Method for PatientAge feature

We'll make use of pd.cut, which bins values into discrete intervals. This method is recommended when the need is to segment and sort data values into bins, and it is also useful for going from a continuous variable to a categorical one. It supports binning into an equal number of bins or a pre-specified array of bin edges.
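Before applying it to PatientAge, a quick toy run of pd.cut with the same four labels used below (toy ages, not the dataset):

```python
import pandas as pd

ages = pd.Series([5, 30, 60, 90, 100])
# Four equal-width bins over [5, 100]; the lowest edge is stretched slightly
# so the minimum value is included.
bins = pd.cut(ages, bins=4, precision=0, labels=['<=26', '<=50', '<=75', '<=100'])
print(bins.tolist())  # ['<=26', '<=50', '<=75', '<=100', '<=100']
```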

In [46]:
print('Creating Age Binning field', '--'*40)
train_class_df['AgeBins'] = pd.cut(train_class_df['PatientAge'], bins = 4, precision = 0, labels = ['<=26', '<=50', '<=75', '<=100'])
train_class_df['AgeBins'].value_counts()
Creating Age Binning field --------------------------------------------------------------------------------
Out[46]:
<=75     13318
<=50     12157
<=26      3972
<=100      780
Name: AgeBins, dtype: int64
In [47]:
print('Value counts of the age bin field created'); print('--'*40)
display(pd.concat([train_class_df['AgeBins'].value_counts().sort_index().rename('Counts of Age Bins, Overall'), 
                   train_class_df.loc[train_class_df['Target'] == 1, 'AgeBins'].value_counts().sort_index().rename('Counts of Age Bins, Target=1')], axis = 1))
print()
f, (ax1, ax2) = plt.subplots(1, 2, figsize = (10, 6))
g = sns.countplot(x = train_class_df['AgeBins'], ax = ax1).set_title('Count Plot of Age Bins, Overall')
g = sns.countplot(x = train_class_df.loc[train_class_df['Target'] == 1, 'AgeBins'], ax = ax2).set_title('Count Plot of Age Bins, Pneumonia Evidence')
plt.tight_layout()
Value counts of the age bin field created
--------------------------------------------------------------------------------
Counts of Age Bins, Overall Counts of Age Bins, Target=1
<=26 3972 1478
<=50 12157 3917
<=75 13318 3895
<=100 780 265

In [48]:
print('Exploring the bounding boxes centers for `AgeBins` for random sample = 200')
# Creating a dataframe with columns for the centers of the bounding boxes
# (.copy() avoids pandas' SettingWithCopyWarning when adding the new columns)
bboxes = train_class_df[train_class_df['Target'] == 1].copy()
bboxes['xw'] = bboxes['x'] + bboxes['width'] / 2
bboxes['yh'] = bboxes['y'] + bboxes['height'] / 2

df1 = bboxes[bboxes['AgeBins'] == '<=26'].sample(200)
df2 = bboxes[bboxes['AgeBins'] == '<=100'].sample(200)
bboxes_scatter(df1, df2, 'Age <= 26 (Lower Bin)', '75 < Age <= 100 (Upper Bin)')
Exploring the bounding boxes centers for `AgeBins` for random sample = 200
In [49]:
print('Checking distribution of age for those with Pneumonia Evidence, by Gender & Count Plot of Gender'); print('--'*40)
display(pd.concat([train_class_df['PatientSex'].value_counts(normalize = True).round(2).sort_values().rename('% Gender, Overall'), 
                   train_class_df.loc[(train_class_df['Target'] == 1), 'PatientSex']
                   .value_counts(normalize = True).round(2).sort_index().rename('% Gender, Target=1')], axis = 1))

f, ((ax1, ax2), (ax3, ax4)) = plt.subplots(2, 2, figsize = (10, 10))
g = sns.distplot(train_class_df.loc[(train_class_df['Target'] == 1) & (train_class_df['PatientSex'] == 'M'), 'PatientAge'], ax = ax1).set_title('Distribution of Age for Male, Pneumonia Evidence')
g = sns.distplot(train_class_df.loc[(train_class_df['Target'] == 1) & (train_class_df['PatientSex'] == 'F'), 'PatientAge'], ax = ax2).set_title('Distribution of Age for Female, Pneumonia Evidence')
g = sns.countplot(y = train_class_df['PatientSex'], ax = ax3, palette = 'PuOr').set_title('Count Plot of Gender, Overall')
g = sns.countplot(y = train_class_df.loc[(train_class_df['Target'] == 1), 'PatientSex'], ax = ax4, palette = 'PuOr').set_title('Count Plot of Gender, Pneumonia Evidence')
plt.tight_layout()
Checking distribution of age for those with Pneumonia Evidence, by Gender & Count Plot of Gender
--------------------------------------------------------------------------------
% Gender, Overall % Gender, Target=1
F 0.43 0.42
M 0.57 0.58
In [50]:
print('Exploring the bounding boxes centers for `PatientSex` for random sample = 1000')
df1 = bboxes[bboxes['PatientSex'] == 'M'].sample(1000)
df2 = bboxes[bboxes['PatientSex'] == 'F'].sample(1000)
bboxes_scatter(df1, df2, 'PatientSex = M', 'PatientSex = F')
Exploring the bounding boxes centers for `PatientSex` for random sample = 1000

Observations: PatientAge & PatientSex

For PatientAge we saw the distribution both overall and where there was evidence of pneumonia, and used binning to check the counts per age bin. Counts were highest for the 26-75 age range, both overall and with pneumonia evidence. We also saw the distribution of age for males and females with pneumonia evidence; the dataset has more males (57%-58%) than females (42%-43%). Only PatientAge, PatientSex and ViewPosition are useful features from the metadata.

Dropping the other features from the train_class dataframe and saving it as a pickle file

In [51]:
train_class_df.drop(['BodyPartExamined', 'Modality', 'AgeBins'], inplace = True, axis = 1)
train_class_df.to_pickle('train_class_features.pkl')
display(train_class_df.shape, train_class_df.head())
(30227, 11)
patientId x y width height Target class path PatientSex PatientAge ViewPosition
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/0004cfab-14fd-4e49-80ba-6... F 51.0 PA
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00313ee0-9eaa-42f4-b0ab-c... F 48.0 PA
2 00322d4d-1c29-4943-afc9-b6754be640eb 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00322d4d-1c29-4943-afc9-b... M 19.0 AP
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/003d8fa0-6bf1-40ed-b54c-a... M 28.0 PA
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1 Lung Opacity stage_2_train_images/00436515-870c-4b36-a041-d... F 32.0 AP
In [52]:
print('Checking sample for different classes')
sample1 = train_class_df.loc[train_class_df['class'] == 'Normal'].iloc[0]
sample2 = train_class_df.loc[train_class_df['class'] == 'No Lung Opacity / Not Normal'].iloc[0]
sample3 = train_class_df.loc[train_class_df['class'] == 'Lung Opacity'].iloc[1]
ds1 = pydicom.dcmread(sample1['path'])
ds2 = pydicom.dcmread(sample2['path'])
ds3 = pydicom.dcmread(sample3['path'])

f, ((ax1, ax2, ax3)) = plt.subplots(1, 3, figsize = (15, 8))
ax1.imshow(ds1.pixel_array, cmap = plt.cm.bone)
ax1.set_title('Class = Normal')
ax1.axis('off')
ax2.imshow(ds2.pixel_array, cmap = plt.cm.bone)
ax2.set_title('Class = No Lung Opacity / Not Normal')
ax2.axis('off')
ax3.imshow(ds3.pixel_array, cmap = plt.cm.bone)
ax3.set_title('Class = Lung Opacity')
ax3.axis('off')
plt.show()
Checking sample for different classes
In [53]:
sample4 = train_class_df.loc[(train_class_df['ViewPosition'] == 'AP')].iloc[0]
sample5 = train_class_df.loc[(train_class_df['ViewPosition'] == 'PA')].iloc[0]
ds4 = pydicom.dcmread(sample4['path'])
ds5 = pydicom.dcmread(sample5['path'])

f, ((ax1, ax2)) = plt.subplots(1, 2, figsize = (15, 8))
ax1.imshow(ds4.pixel_array, cmap = plt.cm.bone)
ax1.set_title('View Position = AP')
ax1.axis('off')
ax2.imshow(ds5.pixel_array, cmap = plt.cm.bone)
ax2.set_title('View Position = PA')
ax2.axis('off')
plt.show()
In [54]:
# Helper function to plot the dicom images
def plot_dicom_images(data, df, img_path):
    img_data = list(data.T.to_dict().values())
    f, ax = plt.subplots(3, 3, figsize = (16, 18))
    for i, row in enumerate(img_data):
        image = row['patientId'] + '.dcm'
        path = os.path.join(img_path, image)
        data = pydicom.read_file(path)
        rows = df[df['patientId'] == row['patientId']]
        age = rows.PatientAge.unique().tolist()[0]
        sex = data.PatientSex
        part = data.BodyPartExamined
        vp = data.ViewPosition
        modality = data.Modality
        data_img = pydicom.dcmread(path)
        ax[i//3, i%3].imshow(data_img.pixel_array, cmap = plt.cm.bone)
        ax[i//3, i%3].axis('off')
        ax[i//3, i%3].set_title('ID: {}\nAge: {}, Sex: {}, Part: {}, VP: {}, Modality: {}\nTarget: {}, Class: {}\nWindow: {}:{}:{}:{}'\
                              .format(row['patientId'], age, sex, part, 
                                      vp, modality, row['Target'], 
                                      row['class'], row['x'], 
                                      row['y'], row['width'],
                                      row['height']))
        box_data = list(rows.T.to_dict().values())
        for j, row in enumerate(box_data):
            ax[i//3, i%3].add_patch(Rectangle(xy = (row['x'], row['y']),
                      width = row['width'], height = row['height'], 
                      color = 'blue', alpha = 0.15)) 
    plt.show()
In [55]:
# this function is a part of custom module imported earlier (`eda`)
plot_dicom_images(data = train_class_df.loc[(train_class_df['Target'] == 1)].sample(9), 
                  df = train_class_df, img_path = TRAIN_IMG_DCM)

Now we will convert the images from DICOM (.dcm) to JPG for faster loading and processing

In [56]:
def convertImage(folder_path, jpg_folder_path):
    if not os.path.exists(jpg_folder_path): 
        os.makedirs(jpg_folder_path)
    images_path = os.listdir(folder_path)
    for n, image in tqdm_notebook(enumerate(images_path)):
        ds = pydicom.dcmread(os.path.join(folder_path, image))
        pixel_array_numpy = ds.pixel_array
        image = image.replace('.dcm', '.jpg')
        cv2.imwrite(os.path.join(jpg_folder_path, image), pixel_array_numpy)
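cv2.imwrite receives the raw pixel_array here, which works because these images are stored 8-bit; DICOM pixel data in general can be 12- or 16-bit, in which case rescaling to uint8 first is safer. A hedged, more general helper (the name `to_uint8` is my own):

```python
import numpy as np

def to_uint8(pixel_array):
    """Min-max rescale an arbitrary-depth pixel array to 8-bit for JPG output."""
    arr = pixel_array.astype(np.float32)
    lo, hi = arr.min(), arr.max()
    if hi > lo:
        arr = (arr - lo) / (hi - lo)      # normalize to [0, 1]
    else:
        arr = np.zeros_like(arr)          # flat image: write all zeros
    return (arr * 255).astype(np.uint8)
```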

Convert all training images

In [57]:
if os.listdir(TRAIN_IMG_DIR_JPG) == []: 
    print("No files found in the directory.") 
    convertImage(TRAIN_IMG_DCM, TRAIN_IMG_DIR_JPG)

Convert all test images

In [58]:
if os.listdir(TEST_IMG_DIR_JPG) == []: 
    print("No files found in the directory.") 
    convertImage(TEST_IMG_DCM, TEST_IMG_DIR_JPG)

Read data and Preparation for Model

In [59]:
ALPHA = 1
IMAGE_SIZE = 1024
IMAGE_HEIGHT = 224
IMAGE_WIDTH = 224
In [60]:
train_class_df.head(5)
Out[60]:
patientId x y width height Target class path PatientSex PatientAge ViewPosition
0 0004cfab-14fd-4e49-80ba-63a80b6bddd6 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/0004cfab-14fd-4e49-80ba-6... F 51.0 PA
1 00313ee0-9eaa-42f4-b0ab-c148ed3241cd 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00313ee0-9eaa-42f4-b0ab-c... F 48.0 PA
2 00322d4d-1c29-4943-afc9-b6754be640eb 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/00322d4d-1c29-4943-afc9-b... M 19.0 AP
3 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/003d8fa0-6bf1-40ed-b54c-a... M 28.0 PA
4 00436515-870c-4b36-a041-de91049b9ab4 264.0 152.0 213.0 379.0 1 Lung Opacity stage_2_train_images/00436515-870c-4b36-a041-d... F 32.0 AP
In [61]:
train_class_df['x2']=train_class_df['x'] + train_class_df['width']
train_class_df['y2']=train_class_df['y'] + train_class_df['height']
train_class_df.rename(columns = {'x':'x1'}, inplace = True)
train_class_df.rename(columns = {'y':'y1'}, inplace = True)
In [62]:
train_class_reduced_df = train_class_df.sample(frac = nbrImages) 
In [63]:
train_class_reduced_df.shape
Out[63]:
(605, 13)
In [64]:
train_class_reduced_df.head()
Out[64]:
patientId x1 y1 width height Target class path PatientSex PatientAge ViewPosition x2 y2
29049 0257438d-8365-4000-935e-3de03114734e 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/0257438d-8365-4000-935e-3... M 35.0 PA 0.0 0.0
9655 6946f198-8ee1-4a54-b6eb-3390f3c43554 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/6946f198-8ee1-4a54-b6eb-3... M 47.0 PA 0.0 0.0
10620 70a5da55-11c3-428d-b307-1a4bfe0124f7 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/70a5da55-11c3-428d-b307-1... M 51.0 PA 0.0 0.0
1446 16164b86-ce73-4531-96d0-d6bdd2a96c94 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/16164b86-ce73-4531-96d0-d... M 58.0 PA 0.0 0.0
7133 547b5577-865c-45c5-af67-9c239e5d4eab 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/547b5577-865c-45c5-af67-9... M 42.0 AP 0.0 0.0
In [65]:
def dropFeatures(train_class_updt_df):
    train_class_reduced_df = train_class_updt_df[['path', 'x1', 'y1','x2','y2','Target']].copy(deep = True)
    train_class_reduced_df['path'] = (train_class_reduced_df['path']
                                 .str.replace('stage_2_train_images', 'JPG_train', regex = False)
                                 .str.replace('.dcm', '.jpg', regex = False))
    print('Distribution of target in the training set:'); 
    display(pd.Series(train_class_reduced_df['Target']).value_counts())
    train_class_reduced_df = train_class_reduced_df.reset_index()
    return train_class_reduced_df
In [66]:
train_class_updt_df = dropFeatures(train_class_reduced_df)
Distribution of target in the training set:
0    411
1    194
Name: Target, dtype: int64

We need to handle class imbalance, so we downsample the majority (non-pneumonia) class to match the minority class

In [67]:
def downSample(train_class_updt_df):
    df_majority = train_class_updt_df[train_class_updt_df.Target==0]
    df_minority = train_class_updt_df[train_class_updt_df.Target==1]
    # Downsample majority class
    df_majority_downsampled = resample(df_majority, 
                                     replace=False,    # sample without replacement
                                     n_samples=df_minority.shape[0],     # to match minority class
                                     random_state=0) # reproducible results

    # Combine minority class with downsampled majority class
    df_majority_downsampled = pd.concat([df_majority_downsampled, df_minority])

    # Display new class counts
    display(df_majority_downsampled.Target.value_counts())
    df_majority_downsampled = df_majority_downsampled.reset_index()
    return df_majority_downsampled
In [68]:
train_class_reduced_df = downSample(train_class_updt_df)
1    194
0    194
Name: Target, dtype: int64
In [69]:
train_class_reduced_df.head(5)
Out[69]:
level_0 index path x1 y1 x2 y2 Target
0 438 10061 JPG_train/6c324073-374c-4319-9399-6e40fbe60b48... 0.0 0.0 0.0 0.0 0
1 118 14617 JPG_train/901a4e14-0abc-4bb9-a29b-52fc0491847f... 0.0 0.0 0.0 0.0 0
2 500 15267 JPG_train/95730185-57e4-4c2b-83af-5e4da5a3b66c... 0.0 0.0 0.0 0.0 0
3 523 25483 JPG_train/e44e3036-d041-4492-aec0-f69180e567fe... 0.0 0.0 0.0 0.0 0
4 444 3205 JPG_train/36ef9113-5ea4-4812-a5ea-fdb0e3e028ba... 0.0 0.0 0.0 0.0 0
In [70]:
def load_image(path):
    img = cv2.imread(path, 1)
    # OpenCV loads images with color channels
    # in BGR order. So we need to reverse them
    return img[...,::-1]
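The `::-1` slice on the last axis is what flips OpenCV's BGR channel order to RGB; a one-pixel check:

```python
import numpy as np

bgr = np.array([[[255, 0, 0]]], dtype=np.uint8)  # one pure-blue pixel in BGR order
rgb = bgr[..., ::-1]                             # reverse the channel axis -> RGB
print(rgb.tolist())  # [[[0, 0, 255]]]
```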
In [71]:
def maskImage(train_class_updt_df):
    masks = np.zeros((int(train_class_updt_df.shape[0]), IMAGE_HEIGHT, IMAGE_WIDTH))
    X = np.zeros((int(train_class_updt_df.shape[0]), IMAGE_HEIGHT, IMAGE_WIDTH, 3))
    for index in tqdm_notebook(range(train_class_updt_df.shape[0])):
        img = load_image(train_class_updt_df['path'][index])
        img = cv2.resize(img, dsize=(IMAGE_HEIGHT, IMAGE_WIDTH), interpolation=cv2.INTER_CUBIC)
        try:
            img = img[:, :, :3]              # keep only the first three channels
        except (TypeError, IndexError):      # skip entries that failed to load
            continue
        x1 = train_class_updt_df['x1'][index]   
        y1 = train_class_updt_df['y1'][index]
        x2 = train_class_updt_df['x2'][index]
        y2 = train_class_updt_df['y2'][index]
        X[index] = preprocess_input(np.array(img, dtype=np.float32))
        
        x1 = int((x1 * IMAGE_WIDTH) / IMAGE_SIZE) 
        x2 = int((x2 * IMAGE_WIDTH) / IMAGE_SIZE)   
        y1 = int((y1 * IMAGE_HEIGHT) / IMAGE_SIZE) 
        y2 = int((y2 * IMAGE_HEIGHT) / IMAGE_SIZE) 
        masks[index][y1:y2, x1:x2] = 1
    return X, masks
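The coordinate scaling inside maskImage maps the original 1024-pixel grid onto the 224×224 mask. A worked check using the bounding box from the sample row shown earlier (x=264, y=152, width=213, height=379):

```python
IMAGE_SIZE, IMAGE_WIDTH, IMAGE_HEIGHT = 1024, 224, 224

x1, y1 = 264, 152
x2, y2 = x1 + 213, y1 + 379          # 477, 531 in 1024-pixel coordinates

def scale(v, new_size):
    # Same arithmetic as maskImage: rescale then truncate to an integer index.
    return int(v * new_size / IMAGE_SIZE)

box = (scale(x1, IMAGE_WIDTH), scale(y1, IMAGE_HEIGHT),
       scale(x2, IMAGE_WIDTH), scale(y2, IMAGE_HEIGHT))
print(box)  # (57, 33, 104, 116)
```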
In [72]:
X, masks = maskImage(train_class_reduced_df)

X.shape

In [73]:
masks.shape
Out[73]:
(388, 224, 224)
In [74]:
def viewImage(n, X, y):
    f, ((ax1, ax2)) = plt.subplots(1, 2, figsize = (15, 8))
    ax1.imshow(X[n], cmap = plt.cm.bone)
    ax1.set_title('Original Image')
    ax1.axis('off')
    ax2.imshow(y[n], cmap = plt.cm.bone)
    ax2.set_title('Masked Image')
    ax2.axis('off')
    plt.show()
In [75]:
def splitData(X, masks,n):
    return X[0:n], masks[0:n],X[n:], masks[n:] 
In [76]:
#X_train, y_train, X_test, y_test = splitData(X, masks,25000)
In [77]:
X_train, X_test, y_train, y_test = train_test_split(X, masks, test_size=0.20, random_state=0)
In [78]:
X_train.shape, y_train.shape, X_test.shape, y_test.shape
Out[78]:
((310, 224, 224, 3), (310, 224, 224), (78, 224, 224, 3), (78, 224, 224))
In [79]:
viewImage(1, X_train, y_train)
Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
In [80]:
viewImage(10, X_train, y_train)
Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
In [81]:
X_train.shape
Out[81]:
(310, 224, 224, 3)

Model 1 : Predict Bounding Boxes

a) MobileNet

In [82]:
def createMobileNetModel(trainable=True):
    mobnet = MobileNet(input_shape=(IMAGE_HEIGHT, IMAGE_WIDTH, 3), include_top=False, alpha=1.0, weights="imagenet")

    for layer in mobnet.layers:
        layer.trainable = trainable

    block1 = mobnet.get_layer("conv_pw_5_relu").output
    block2 = mobnet.get_layer("conv_pw_11_relu").output
    block3 = mobnet.get_layer("conv_pw_13_relu").output

    x = Concatenate()([UpSampling2D()(block3), block2])
    x = Concatenate()([UpSampling2D()(x), block1])
    x = Conv2D(1, kernel_size=1, activation="sigmoid")(x)
    x = UpSampling2D()(x)
    x = UpSampling2D()(x)
    x = UpSampling2D()(x)
    
    x = Reshape((IMAGE_HEIGHT, IMAGE_WIDTH))(x)

    return Model(inputs=mobnet.input, outputs=x)
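A quick sanity check of the decoder's spatial arithmetic, without loading TensorFlow: at a 224×224 input, conv_pw_13_relu is 7×7, conv_pw_11_relu is 14×14 and conv_pw_5_relu is 28×28, and each UpSampling2D doubles the spatial size:

```python
size = 7          # conv_pw_13_relu output at a 224x224 input
size *= 2         # UpSampling2D to match block2 (14x14), then concatenate
size *= 2         # UpSampling2D to match block1 (28x28), then concatenate
for _ in range(3):
    size *= 2     # three trailing UpSampling2D layers: 56 -> 112 -> 224
print(size)  # 224: the output mask matches the input resolution
```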
In [83]:
model = createMobileNetModel(False)
model.summary()
Model: "model"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_1 (InputLayer)            [(None, 224, 224, 3) 0                                            
__________________________________________________________________________________________________
conv1_pad (ZeroPadding2D)       (None, 225, 225, 3)  0           input_1[0][0]                    
__________________________________________________________________________________________________
conv1 (Conv2D)                  (None, 112, 112, 32) 864         conv1_pad[0][0]                  
__________________________________________________________________________________________________
conv1_bn (BatchNormalization)   (None, 112, 112, 32) 128         conv1[0][0]                      
__________________________________________________________________________________________________
conv1_relu (ReLU)               (None, 112, 112, 32) 0           conv1_bn[0][0]                   
__________________________________________________________________________________________________
conv_dw_1 (DepthwiseConv2D)     (None, 112, 112, 32) 288         conv1_relu[0][0]                 
__________________________________________________________________________________________________
conv_dw_1_bn (BatchNormalizatio (None, 112, 112, 32) 128         conv_dw_1[0][0]                  
__________________________________________________________________________________________________
conv_dw_1_relu (ReLU)           (None, 112, 112, 32) 0           conv_dw_1_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_1 (Conv2D)              (None, 112, 112, 64) 2048        conv_dw_1_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_1_bn (BatchNormalizatio (None, 112, 112, 64) 256         conv_pw_1[0][0]                  
__________________________________________________________________________________________________
conv_pw_1_relu (ReLU)           (None, 112, 112, 64) 0           conv_pw_1_bn[0][0]               
__________________________________________________________________________________________________
conv_pad_2 (ZeroPadding2D)      (None, 113, 113, 64) 0           conv_pw_1_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_2 (DepthwiseConv2D)     (None, 56, 56, 64)   576         conv_pad_2[0][0]                 
__________________________________________________________________________________________________
conv_dw_2_bn (BatchNormalizatio (None, 56, 56, 64)   256         conv_dw_2[0][0]                  
__________________________________________________________________________________________________
conv_dw_2_relu (ReLU)           (None, 56, 56, 64)   0           conv_dw_2_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_2 (Conv2D)              (None, 56, 56, 128)  8192        conv_dw_2_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_2_bn (BatchNormalizatio (None, 56, 56, 128)  512         conv_pw_2[0][0]                  
__________________________________________________________________________________________________
conv_pw_2_relu (ReLU)           (None, 56, 56, 128)  0           conv_pw_2_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_3 (DepthwiseConv2D)     (None, 56, 56, 128)  1152        conv_pw_2_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_3_bn (BatchNormalizatio (None, 56, 56, 128)  512         conv_dw_3[0][0]                  
__________________________________________________________________________________________________
conv_dw_3_relu (ReLU)           (None, 56, 56, 128)  0           conv_dw_3_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_3 (Conv2D)              (None, 56, 56, 128)  16384       conv_dw_3_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_3_bn (BatchNormalizatio (None, 56, 56, 128)  512         conv_pw_3[0][0]                  
__________________________________________________________________________________________________
conv_pw_3_relu (ReLU)           (None, 56, 56, 128)  0           conv_pw_3_bn[0][0]               
__________________________________________________________________________________________________
conv_pad_4 (ZeroPadding2D)      (None, 57, 57, 128)  0           conv_pw_3_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_4 (DepthwiseConv2D)     (None, 28, 28, 128)  1152        conv_pad_4[0][0]                 
__________________________________________________________________________________________________
conv_dw_4_bn (BatchNormalizatio (None, 28, 28, 128)  512         conv_dw_4[0][0]                  
__________________________________________________________________________________________________
conv_dw_4_relu (ReLU)           (None, 28, 28, 128)  0           conv_dw_4_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_4 (Conv2D)              (None, 28, 28, 256)  32768       conv_dw_4_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_4_bn (BatchNormalizatio (None, 28, 28, 256)  1024        conv_pw_4[0][0]                  
__________________________________________________________________________________________________
conv_pw_4_relu (ReLU)           (None, 28, 28, 256)  0           conv_pw_4_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_5 (DepthwiseConv2D)     (None, 28, 28, 256)  2304        conv_pw_4_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_5_bn (BatchNormalizatio (None, 28, 28, 256)  1024        conv_dw_5[0][0]                  
__________________________________________________________________________________________________
conv_dw_5_relu (ReLU)           (None, 28, 28, 256)  0           conv_dw_5_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_5 (Conv2D)              (None, 28, 28, 256)  65536       conv_dw_5_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_5_bn (BatchNormalizatio (None, 28, 28, 256)  1024        conv_pw_5[0][0]                  
__________________________________________________________________________________________________
conv_pw_5_relu (ReLU)           (None, 28, 28, 256)  0           conv_pw_5_bn[0][0]               
__________________________________________________________________________________________________
conv_pad_6 (ZeroPadding2D)      (None, 29, 29, 256)  0           conv_pw_5_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_6 (DepthwiseConv2D)     (None, 14, 14, 256)  2304        conv_pad_6[0][0]                 
__________________________________________________________________________________________________
conv_dw_6_bn (BatchNormalizatio (None, 14, 14, 256)  1024        conv_dw_6[0][0]                  
__________________________________________________________________________________________________
conv_dw_6_relu (ReLU)           (None, 14, 14, 256)  0           conv_dw_6_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_6 (Conv2D)              (None, 14, 14, 512)  131072      conv_dw_6_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_6_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_6[0][0]                  
__________________________________________________________________________________________________
conv_pw_6_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_6_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_7 (DepthwiseConv2D)     (None, 14, 14, 512)  4608        conv_pw_6_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_7_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_dw_7[0][0]                  
__________________________________________________________________________________________________
conv_dw_7_relu (ReLU)           (None, 14, 14, 512)  0           conv_dw_7_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_7 (Conv2D)              (None, 14, 14, 512)  262144      conv_dw_7_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_7_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_7[0][0]                  
__________________________________________________________________________________________________
conv_pw_7_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_7_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_8 (DepthwiseConv2D)     (None, 14, 14, 512)  4608        conv_pw_7_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_8_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_dw_8[0][0]                  
__________________________________________________________________________________________________
conv_dw_8_relu (ReLU)           (None, 14, 14, 512)  0           conv_dw_8_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_8 (Conv2D)              (None, 14, 14, 512)  262144      conv_dw_8_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_8_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_8[0][0]                  
__________________________________________________________________________________________________
conv_pw_8_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_8_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_9 (DepthwiseConv2D)     (None, 14, 14, 512)  4608        conv_pw_8_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_9_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_dw_9[0][0]                  
__________________________________________________________________________________________________
conv_dw_9_relu (ReLU)           (None, 14, 14, 512)  0           conv_dw_9_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_9 (Conv2D)              (None, 14, 14, 512)  262144      conv_dw_9_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_9_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_9[0][0]                  
__________________________________________________________________________________________________
conv_pw_9_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_9_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_10 (DepthwiseConv2D)    (None, 14, 14, 512)  4608        conv_pw_9_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_10_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_dw_10[0][0]                 
__________________________________________________________________________________________________
conv_dw_10_relu (ReLU)          (None, 14, 14, 512)  0           conv_dw_10_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_10 (Conv2D)             (None, 14, 14, 512)  262144      conv_dw_10_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_10_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_pw_10[0][0]                 
__________________________________________________________________________________________________
conv_pw_10_relu (ReLU)          (None, 14, 14, 512)  0           conv_pw_10_bn[0][0]              
__________________________________________________________________________________________________
conv_dw_11 (DepthwiseConv2D)    (None, 14, 14, 512)  4608        conv_pw_10_relu[0][0]            
__________________________________________________________________________________________________
conv_dw_11_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_dw_11[0][0]                 
__________________________________________________________________________________________________
conv_dw_11_relu (ReLU)          (None, 14, 14, 512)  0           conv_dw_11_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_11 (Conv2D)             (None, 14, 14, 512)  262144      conv_dw_11_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_11_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_pw_11[0][0]                 
__________________________________________________________________________________________________
conv_pw_11_relu (ReLU)          (None, 14, 14, 512)  0           conv_pw_11_bn[0][0]              
__________________________________________________________________________________________________
conv_pad_12 (ZeroPadding2D)     (None, 15, 15, 512)  0           conv_pw_11_relu[0][0]            
__________________________________________________________________________________________________
conv_dw_12 (DepthwiseConv2D)    (None, 7, 7, 512)    4608        conv_pad_12[0][0]                
__________________________________________________________________________________________________
conv_dw_12_bn (BatchNormalizati (None, 7, 7, 512)    2048        conv_dw_12[0][0]                 
__________________________________________________________________________________________________
conv_dw_12_relu (ReLU)          (None, 7, 7, 512)    0           conv_dw_12_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_12 (Conv2D)             (None, 7, 7, 1024)   524288      conv_dw_12_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_12_bn (BatchNormalizati (None, 7, 7, 1024)   4096        conv_pw_12[0][0]                 
__________________________________________________________________________________________________
conv_pw_12_relu (ReLU)          (None, 7, 7, 1024)   0           conv_pw_12_bn[0][0]              
__________________________________________________________________________________________________
conv_dw_13 (DepthwiseConv2D)    (None, 7, 7, 1024)   9216        conv_pw_12_relu[0][0]            
__________________________________________________________________________________________________
conv_dw_13_bn (BatchNormalizati (None, 7, 7, 1024)   4096        conv_dw_13[0][0]                 
__________________________________________________________________________________________________
conv_dw_13_relu (ReLU)          (None, 7, 7, 1024)   0           conv_dw_13_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_13 (Conv2D)             (None, 7, 7, 1024)   1048576     conv_dw_13_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_13_bn (BatchNormalizati (None, 7, 7, 1024)   4096        conv_pw_13[0][0]                 
__________________________________________________________________________________________________
conv_pw_13_relu (ReLU)          (None, 7, 7, 1024)   0           conv_pw_13_bn[0][0]              
__________________________________________________________________________________________________
up_sampling2d (UpSampling2D)    (None, 14, 14, 1024) 0           conv_pw_13_relu[0][0]            
__________________________________________________________________________________________________
concatenate (Concatenate)       (None, 14, 14, 1536) 0           up_sampling2d[0][0]              
                                                                 conv_pw_11_relu[0][0]            
__________________________________________________________________________________________________
up_sampling2d_1 (UpSampling2D)  (None, 28, 28, 1536) 0           concatenate[0][0]                
__________________________________________________________________________________________________
concatenate_1 (Concatenate)     (None, 28, 28, 1792) 0           up_sampling2d_1[0][0]            
                                                                 conv_pw_5_relu[0][0]             
__________________________________________________________________________________________________
conv2d (Conv2D)                 (None, 28, 28, 1)    1793        concatenate_1[0][0]              
__________________________________________________________________________________________________
up_sampling2d_2 (UpSampling2D)  (None, 56, 56, 1)    0           conv2d[0][0]                     
__________________________________________________________________________________________________
up_sampling2d_3 (UpSampling2D)  (None, 112, 112, 1)  0           up_sampling2d_2[0][0]            
__________________________________________________________________________________________________
up_sampling2d_4 (UpSampling2D)  (None, 224, 224, 1)  0           up_sampling2d_3[0][0]            
__________________________________________________________________________________________________
reshape (Reshape)               (None, 224, 224)     0           up_sampling2d_4[0][0]            
==================================================================================================
Total params: 3,230,657
Trainable params: 1,793
Non-trainable params: 3,228,864
__________________________________________________________________________________________________
In [84]:
def dice_coefficient(y_true, y_pred):
    # Dice = 2|A∩B| / (|A| + |B|); epsilon guards against division by zero on empty masks
    numerator = 2 * tf.reduce_sum(y_true * y_pred)
    denominator = tf.reduce_sum(y_true + y_pred)

    return numerator / (denominator + tf.keras.backend.epsilon())
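As a quick sanity check, here is the same Dice formula written out in NumPy on tiny flat masks (an illustrative sketch only, not part of the training pipeline): overlap between prediction and ground truth pushes the score toward 1, disjoint masks toward 0.

```python
import numpy as np

def dice_np(y_true, y_pred, eps=1e-7):
    # Same formula as the TF version above, applied to flat NumPy arrays
    numerator = 2 * np.sum(y_true * y_pred)
    denominator = np.sum(y_true + y_pred)
    return numerator / (denominator + eps)

y_true = np.array([1, 1, 0, 0], dtype=float)
y_pred = np.array([1, 0, 0, 0], dtype=float)  # one of two positive pixels found

print(dice_np(y_true, y_pred))   # ~0.667: partial overlap
print(dice_np(y_true, y_true))   # ~1.0: perfect overlap
```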
In [85]:
def loss(y_true, y_pred):
    # Pixel-wise binary cross-entropy plus a -log(Dice) term that directly rewards mask overlap
    return binary_crossentropy(y_true, y_pred) - tf.keras.backend.log(dice_coefficient(y_true, y_pred) + tf.keras.backend.epsilon())
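To see why the combined loss helps, a NumPy sketch of the same idea (cross-entropy written out by hand, with predictions clipped away from 0 and 1; toy masks are illustrative): a prediction with good overlap scores a much lower loss than one with poor overlap, because the -log(Dice) term blows up as Dice approaches 0.

```python
import numpy as np

def bce_np(y_true, y_pred, eps=1e-7):
    # Mean binary cross-entropy over pixels, clipped for numerical stability
    p = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

def loss_np(y_true, y_pred, eps=1e-7):
    # BCE minus log(Dice), mirroring the TF loss above
    num = 2 * np.sum(y_true * y_pred)
    den = np.sum(y_true + y_pred)
    dice = num / (den + eps)
    return bce_np(y_true, y_pred) - np.log(dice + eps)

y_true = np.array([1.0, 1.0, 0.0, 0.0])
good = np.array([0.9, 0.8, 0.1, 0.2])  # mostly correct mask
bad  = np.array([0.2, 0.1, 0.8, 0.9])  # mostly inverted mask

print(loss_np(y_true, good) < loss_np(y_true, bad))  # True: better masks score lower
```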
In [86]:
optimizer = Adam(lr=1e-4, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0, amsgrad=False)
model.compile(loss=loss, optimizer=optimizer, metrics=[dice_coefficient])
In [87]:
checkpoint = ModelCheckpoint("model-{val_loss:.2f}.h5", monitor="val_loss", verbose=1, save_best_only=True, save_weights_only=True)

stop = EarlyStopping(monitor="val_loss", patience=5)

reduce_lr = ReduceLROnPlateau(monitor="val_loss", factor=0.2, patience=3, min_lr=1e-6, verbose=1)
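With factor=0.2, min_lr=1e-6 and the initial learning rate of 1e-4, successive plateau reductions follow a simple geometric decay clipped at the floor. A small sketch of that arithmetic (a hypothetical helper, not a Keras API):

```python
def lr_schedule(initial_lr, factor, min_lr, reductions):
    # Simulate successive ReduceLROnPlateau steps: multiply by factor, clip at min_lr
    lr = initial_lr
    rates = [lr]
    for _ in range(reductions):
        lr = max(lr * factor, min_lr)
        rates.append(lr)
    return rates

# First reduction lands at ~2e-5 (as seen in the training log), then ~4e-6,
# after which the min_lr floor of 1e-6 takes over.
print(lr_schedule(1e-4, 0.2, 1e-6, 4))
```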
In [88]:
model.fit(X_train, y_train, epochs = 30, 
          batch_size = 1, callbacks = [checkpoint, reduce_lr, stop], validation_data = (X_test, y_test))
Train on 310 samples, validate on 78 samples
Epoch 1/30
309/310 [============================>.] - ETA: 0s - loss: 9.5977 - dice_coefficient: 0.0600
Epoch 00001: val_loss improved from inf to 9.48288, saving model to model-9.48.h5
310/310 [==============================] - 44s 142ms/sample - loss: 9.6194 - dice_coefficient: 0.0598 - val_loss: 9.4829 - val_dice_coefficient: 0.0588
Epoch 2/30
309/310 [============================>.] - ETA: 0s - loss: 9.2862 - dice_coefficient: 0.0914
Epoch 00002: val_loss improved from 9.48288 to 9.41808, saving model to model-9.42.h5
310/310 [==============================] - 34s 108ms/sample - loss: 9.2681 - dice_coefficient: 0.0912 - val_loss: 9.4181 - val_dice_coefficient: 0.0620
Epoch 3/30
309/310 [============================>.] - ETA: 0s - loss: 9.1480 - dice_coefficient: 0.1094
Epoch 00003: val_loss improved from 9.41808 to 9.35567, saving model to model-9.36.h5
310/310 [==============================] - 36s 115ms/sample - loss: 9.1241 - dice_coefficient: 0.1100 - val_loss: 9.3557 - val_dice_coefficient: 0.0688
Epoch 4/30
309/310 [============================>.] - ETA: 0s - loss: 9.0166 - dice_coefficient: 0.1245
Epoch 00004: val_loss improved from 9.35567 to 9.32570, saving model to model-9.33.h5
310/310 [==============================] - 34s 110ms/sample - loss: 9.0397 - dice_coefficient: 0.1241 - val_loss: 9.3257 - val_dice_coefficient: 0.0726
Epoch 5/30
309/310 [============================>.] - ETA: 0s - loss: 9.0026 - dice_coefficient: 0.1342
Epoch 00005: val_loss improved from 9.32570 to 9.25916, saving model to model-9.26.h5
310/310 [==============================] - 34s 108ms/sample - loss: 8.9796 - dice_coefficient: 0.1344 - val_loss: 9.2592 - val_dice_coefficient: 0.0813
Epoch 6/30
309/310 [============================>.] - ETA: 0s - loss: 8.9553 - dice_coefficient: 0.1434
Epoch 00006: val_loss improved from 9.25916 to 9.24724, saving model to model-9.25.h5
310/310 [==============================] - 34s 110ms/sample - loss: 8.9341 - dice_coefficient: 0.1434 - val_loss: 9.2472 - val_dice_coefficient: 0.0834
Epoch 7/30
309/310 [============================>.] - ETA: 0s - loss: 8.9215 - dice_coefficient: 0.1501
Epoch 00007: val_loss did not improve from 9.24724
310/310 [==============================] - 32s 102ms/sample - loss: 8.8973 - dice_coefficient: 0.1506 - val_loss: 9.2490 - val_dice_coefficient: 0.0835
Epoch 8/30
309/310 [============================>.] - ETA: 0s - loss: 8.8926 - dice_coefficient: 0.1562
Epoch 00008: val_loss did not improve from 9.24724
310/310 [==============================] - 31s 101ms/sample - loss: 8.8688 - dice_coefficient: 0.1566 - val_loss: 9.2800 - val_dice_coefficient: 0.0801
Epoch 9/30
309/310 [============================>.] - ETA: 0s - loss: 8.8201 - dice_coefficient: 0.1620
Epoch 00009: val_loss improved from 9.24724 to 9.22994, saving model to model-9.23.h5
310/310 [==============================] - 32s 103ms/sample - loss: 8.8444 - dice_coefficient: 0.1615 - val_loss: 9.2299 - val_dice_coefficient: 0.0871
Epoch 10/30
309/310 [============================>.] - ETA: 0s - loss: 8.8427 - dice_coefficient: 0.1675
Epoch 00010: val_loss did not improve from 9.22994
310/310 [==============================] - 32s 102ms/sample - loss: 8.8209 - dice_coefficient: 0.1675 - val_loss: 9.2656 - val_dice_coefficient: 0.0829
Epoch 11/30
309/310 [============================>.] - ETA: 0s - loss: 8.7776 - dice_coefficient: 0.1722
Epoch 00011: val_loss did not improve from 9.22994
310/310 [==============================] - 32s 102ms/sample - loss: 8.8014 - dice_coefficient: 0.1717 - val_loss: 9.2763 - val_dice_coefficient: 0.0821
Epoch 12/30
309/310 [============================>.] - ETA: 0s - loss: 8.8076 - dice_coefficient: 0.1752
Epoch 00012: val_loss did not improve from 9.22994

Epoch 00012: ReduceLROnPlateau reducing learning rate to 1.9999999494757503e-05.
310/310 [==============================] - 32s 103ms/sample - loss: 8.7841 - dice_coefficient: 0.1755 - val_loss: 9.3063 - val_dice_coefficient: 0.0792
Epoch 13/30
309/310 [============================>.] - ETA: 0s - loss: 8.7870 - dice_coefficient: 0.1802
Epoch 00013: val_loss did not improve from 9.22994
310/310 [==============================] - 32s 102ms/sample - loss: 8.7633 - dice_coefficient: 0.1806 - val_loss: 9.3077 - val_dice_coefficient: 0.0791
Epoch 14/30
309/310 [============================>.] - ETA: 0s - loss: 8.7842 - dice_coefficient: 0.1809
Epoch 00014: val_loss did not improve from 9.22994
310/310 [==============================] - 32s 103ms/sample - loss: 8.7602 - dice_coefficient: 0.1813 - val_loss: 9.3184 - val_dice_coefficient: 0.0780
Out[88]:
<tensorflow.python.keras.callbacks.History at 0x7f9d72300b00>
In [89]:
scores = model.evaluate(X_test, y_test, verbose = 1)
78/78 [==============================] - 5s 60ms/sample - loss: 1.8960 - dice_coefficient: 0.1621
In [90]:
print("Dice coefficient: ", scores[1])
print("Loss: ", scores[0])
Dice coefficient:  0.16208078
Loss:  2.0891477236380944
In [91]:
y_pred = model.predict(X_test, verbose = 1)
78/78 [==============================] - 5s 61ms/sample
In [92]:
viewImage(3, X_test, y_pred)
Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).

b) MobileNet with additional layers

In [93]:
def conv_block_simple(prevlayer, filters, prefix, strides=(1, 1)):
    # Basic decoder block: 3x3 convolution -> batch norm -> ReLU
    conv = Conv2D(filters, (3, 3), padding = 'same', kernel_initializer = 'he_normal', strides = strides, name = prefix + '_conv')(prevlayer)
    conv = BatchNormalization(name = prefix + 'BatchNormalization')(conv)
    conv = Activation('relu', name = prefix + 'ActivationLayer')(conv)
    return conv

def createMobileNetModel2(trainable=True):
    # ImageNet-pretrained MobileNet encoder; the layers below add a U-Net-style decoder on top
    model = MobileNet(input_shape=(IMAGE_HEIGHT, IMAGE_WIDTH, 3), include_top=False, alpha=ALPHA, weights="imagenet")

    for layer in model.layers:
        layer.trainable = trainable

    # Encoder feature maps used as skip connections, from deepest (7x7) to shallowest (112x112)
    block1 = model.get_layer('conv_pw_13_relu').output
    block2 = model.get_layer('conv_pw_11_relu').output
    block3 = model.get_layer('conv_pw_5_relu').output
    block4 = model.get_layer('conv_pw_3_relu').output
    block5 = model.get_layer('conv_pw_1_relu').output

    up1 = Concatenate()([UpSampling2D()(block1), block2])
    conv6 = conv_block_simple(up1, 256, 'Conv_6_1')
    conv6 = conv_block_simple(conv6, 256, 'Conv_6_2')

    up2 = Concatenate()([UpSampling2D()(conv6), block3])
    conv7 = conv_block_simple(up2, 256, 'Conv_7_1')
    conv7 = conv_block_simple(conv7, 256, 'Conv_7_2')

    up3 = Concatenate()([UpSampling2D()(conv7), block4])
    conv8 = conv_block_simple(up3, 192, 'Conv_8_1')
    conv8 = conv_block_simple(conv8, 128, 'Conv_8_2')

    up4 = Concatenate()([UpSampling2D()(conv8), block5])
    conv9 = conv_block_simple(up4, 96, 'Conv_9_1')
    conv9 = conv_block_simple(conv9, 64, 'Conv_9_2')

    up5 = Concatenate()([UpSampling2D()(conv9), model.input])
    conv10 = conv_block_simple(up5, 48, 'Conv_10_1')
    conv10 = conv_block_simple(conv10, 32, 'Conv_10_2')
    conv10 = SpatialDropout2D(0.2)(conv10)
    
    x = Conv2D(1, (1, 1), activation = 'sigmoid')(conv10)
    x = Reshape((IMAGE_HEIGHT, IMAGE_WIDTH))(x)
    return Model(inputs = model.input, outputs = x)
In [94]:
model = createMobileNetModel2(False)
model.summary()
Model: "model_1"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_2 (InputLayer)            [(None, 224, 224, 3) 0                                            
__________________________________________________________________________________________________
conv1_pad (ZeroPadding2D)       (None, 225, 225, 3)  0           input_2[0][0]                    
__________________________________________________________________________________________________
conv1 (Conv2D)                  (None, 112, 112, 32) 864         conv1_pad[0][0]                  
__________________________________________________________________________________________________
conv1_bn (BatchNormalization)   (None, 112, 112, 32) 128         conv1[0][0]                      
__________________________________________________________________________________________________
conv1_relu (ReLU)               (None, 112, 112, 32) 0           conv1_bn[0][0]                   
__________________________________________________________________________________________________
conv_dw_1 (DepthwiseConv2D)     (None, 112, 112, 32) 288         conv1_relu[0][0]                 
__________________________________________________________________________________________________
conv_dw_1_bn (BatchNormalizatio (None, 112, 112, 32) 128         conv_dw_1[0][0]                  
__________________________________________________________________________________________________
conv_dw_1_relu (ReLU)           (None, 112, 112, 32) 0           conv_dw_1_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_1 (Conv2D)              (None, 112, 112, 64) 2048        conv_dw_1_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_1_bn (BatchNormalizatio (None, 112, 112, 64) 256         conv_pw_1[0][0]                  
__________________________________________________________________________________________________
conv_pw_1_relu (ReLU)           (None, 112, 112, 64) 0           conv_pw_1_bn[0][0]               
__________________________________________________________________________________________________
conv_pad_2 (ZeroPadding2D)      (None, 113, 113, 64) 0           conv_pw_1_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_2 (DepthwiseConv2D)     (None, 56, 56, 64)   576         conv_pad_2[0][0]                 
__________________________________________________________________________________________________
conv_dw_2_bn (BatchNormalizatio (None, 56, 56, 64)   256         conv_dw_2[0][0]                  
__________________________________________________________________________________________________
conv_dw_2_relu (ReLU)           (None, 56, 56, 64)   0           conv_dw_2_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_2 (Conv2D)              (None, 56, 56, 128)  8192        conv_dw_2_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_2_bn (BatchNormalizatio (None, 56, 56, 128)  512         conv_pw_2[0][0]                  
__________________________________________________________________________________________________
conv_pw_2_relu (ReLU)           (None, 56, 56, 128)  0           conv_pw_2_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_3 (DepthwiseConv2D)     (None, 56, 56, 128)  1152        conv_pw_2_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_3_bn (BatchNormalizatio (None, 56, 56, 128)  512         conv_dw_3[0][0]                  
__________________________________________________________________________________________________
conv_dw_3_relu (ReLU)           (None, 56, 56, 128)  0           conv_dw_3_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_3 (Conv2D)              (None, 56, 56, 128)  16384       conv_dw_3_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_3_bn (BatchNormalizatio (None, 56, 56, 128)  512         conv_pw_3[0][0]                  
__________________________________________________________________________________________________
conv_pw_3_relu (ReLU)           (None, 56, 56, 128)  0           conv_pw_3_bn[0][0]               
__________________________________________________________________________________________________
conv_pad_4 (ZeroPadding2D)      (None, 57, 57, 128)  0           conv_pw_3_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_4 (DepthwiseConv2D)     (None, 28, 28, 128)  1152        conv_pad_4[0][0]                 
__________________________________________________________________________________________________
conv_dw_4_bn (BatchNormalizatio (None, 28, 28, 128)  512         conv_dw_4[0][0]                  
__________________________________________________________________________________________________
conv_dw_4_relu (ReLU)           (None, 28, 28, 128)  0           conv_dw_4_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_4 (Conv2D)              (None, 28, 28, 256)  32768       conv_dw_4_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_4_bn (BatchNormalizatio (None, 28, 28, 256)  1024        conv_pw_4[0][0]                  
__________________________________________________________________________________________________
conv_pw_4_relu (ReLU)           (None, 28, 28, 256)  0           conv_pw_4_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_5 (DepthwiseConv2D)     (None, 28, 28, 256)  2304        conv_pw_4_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_5_bn (BatchNormalizatio (None, 28, 28, 256)  1024        conv_dw_5[0][0]                  
__________________________________________________________________________________________________
conv_dw_5_relu (ReLU)           (None, 28, 28, 256)  0           conv_dw_5_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_5 (Conv2D)              (None, 28, 28, 256)  65536       conv_dw_5_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_5_bn (BatchNormalizatio (None, 28, 28, 256)  1024        conv_pw_5[0][0]                  
__________________________________________________________________________________________________
conv_pw_5_relu (ReLU)           (None, 28, 28, 256)  0           conv_pw_5_bn[0][0]               
__________________________________________________________________________________________________
conv_pad_6 (ZeroPadding2D)      (None, 29, 29, 256)  0           conv_pw_5_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_6 (DepthwiseConv2D)     (None, 14, 14, 256)  2304        conv_pad_6[0][0]                 
__________________________________________________________________________________________________
conv_dw_6_bn (BatchNormalizatio (None, 14, 14, 256)  1024        conv_dw_6[0][0]                  
__________________________________________________________________________________________________
conv_dw_6_relu (ReLU)           (None, 14, 14, 256)  0           conv_dw_6_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_6 (Conv2D)              (None, 14, 14, 512)  131072      conv_dw_6_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_6_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_6[0][0]                  
__________________________________________________________________________________________________
conv_pw_6_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_6_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_7 (DepthwiseConv2D)     (None, 14, 14, 512)  4608        conv_pw_6_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_7_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_dw_7[0][0]                  
__________________________________________________________________________________________________
conv_dw_7_relu (ReLU)           (None, 14, 14, 512)  0           conv_dw_7_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_7 (Conv2D)              (None, 14, 14, 512)  262144      conv_dw_7_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_7_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_7[0][0]                  
__________________________________________________________________________________________________
conv_pw_7_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_7_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_8 (DepthwiseConv2D)     (None, 14, 14, 512)  4608        conv_pw_7_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_8_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_dw_8[0][0]                  
__________________________________________________________________________________________________
conv_dw_8_relu (ReLU)           (None, 14, 14, 512)  0           conv_dw_8_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_8 (Conv2D)              (None, 14, 14, 512)  262144      conv_dw_8_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_8_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_8[0][0]                  
__________________________________________________________________________________________________
conv_pw_8_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_8_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_9 (DepthwiseConv2D)     (None, 14, 14, 512)  4608        conv_pw_8_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_9_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_dw_9[0][0]                  
__________________________________________________________________________________________________
conv_dw_9_relu (ReLU)           (None, 14, 14, 512)  0           conv_dw_9_bn[0][0]               
__________________________________________________________________________________________________
conv_pw_9 (Conv2D)              (None, 14, 14, 512)  262144      conv_dw_9_relu[0][0]             
__________________________________________________________________________________________________
conv_pw_9_bn (BatchNormalizatio (None, 14, 14, 512)  2048        conv_pw_9[0][0]                  
__________________________________________________________________________________________________
conv_pw_9_relu (ReLU)           (None, 14, 14, 512)  0           conv_pw_9_bn[0][0]               
__________________________________________________________________________________________________
conv_dw_10 (DepthwiseConv2D)    (None, 14, 14, 512)  4608        conv_pw_9_relu[0][0]             
__________________________________________________________________________________________________
conv_dw_10_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_dw_10[0][0]                 
__________________________________________________________________________________________________
conv_dw_10_relu (ReLU)          (None, 14, 14, 512)  0           conv_dw_10_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_10 (Conv2D)             (None, 14, 14, 512)  262144      conv_dw_10_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_10_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_pw_10[0][0]                 
__________________________________________________________________________________________________
conv_pw_10_relu (ReLU)          (None, 14, 14, 512)  0           conv_pw_10_bn[0][0]              
__________________________________________________________________________________________________
conv_dw_11 (DepthwiseConv2D)    (None, 14, 14, 512)  4608        conv_pw_10_relu[0][0]            
__________________________________________________________________________________________________
conv_dw_11_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_dw_11[0][0]                 
__________________________________________________________________________________________________
conv_dw_11_relu (ReLU)          (None, 14, 14, 512)  0           conv_dw_11_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_11 (Conv2D)             (None, 14, 14, 512)  262144      conv_dw_11_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_11_bn (BatchNormalizati (None, 14, 14, 512)  2048        conv_pw_11[0][0]                 
__________________________________________________________________________________________________
conv_pw_11_relu (ReLU)          (None, 14, 14, 512)  0           conv_pw_11_bn[0][0]              
__________________________________________________________________________________________________
conv_pad_12 (ZeroPadding2D)     (None, 15, 15, 512)  0           conv_pw_11_relu[0][0]            
__________________________________________________________________________________________________
conv_dw_12 (DepthwiseConv2D)    (None, 7, 7, 512)    4608        conv_pad_12[0][0]                
__________________________________________________________________________________________________
conv_dw_12_bn (BatchNormalizati (None, 7, 7, 512)    2048        conv_dw_12[0][0]                 
__________________________________________________________________________________________________
conv_dw_12_relu (ReLU)          (None, 7, 7, 512)    0           conv_dw_12_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_12 (Conv2D)             (None, 7, 7, 1024)   524288      conv_dw_12_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_12_bn (BatchNormalizati (None, 7, 7, 1024)   4096        conv_pw_12[0][0]                 
__________________________________________________________________________________________________
conv_pw_12_relu (ReLU)          (None, 7, 7, 1024)   0           conv_pw_12_bn[0][0]              
__________________________________________________________________________________________________
conv_dw_13 (DepthwiseConv2D)    (None, 7, 7, 1024)   9216        conv_pw_12_relu[0][0]            
__________________________________________________________________________________________________
conv_dw_13_bn (BatchNormalizati (None, 7, 7, 1024)   4096        conv_dw_13[0][0]                 
__________________________________________________________________________________________________
conv_dw_13_relu (ReLU)          (None, 7, 7, 1024)   0           conv_dw_13_bn[0][0]              
__________________________________________________________________________________________________
conv_pw_13 (Conv2D)             (None, 7, 7, 1024)   1048576     conv_dw_13_relu[0][0]            
__________________________________________________________________________________________________
conv_pw_13_bn (BatchNormalizati (None, 7, 7, 1024)   4096        conv_pw_13[0][0]                 
__________________________________________________________________________________________________
conv_pw_13_relu (ReLU)          (None, 7, 7, 1024)   0           conv_pw_13_bn[0][0]              
__________________________________________________________________________________________________
up_sampling2d_5 (UpSampling2D)  (None, 14, 14, 1024) 0           conv_pw_13_relu[0][0]            
__________________________________________________________________________________________________
concatenate_2 (Concatenate)     (None, 14, 14, 1536) 0           up_sampling2d_5[0][0]            
                                                                 conv_pw_11_relu[0][0]            
__________________________________________________________________________________________________
Conv_6_1_conv (Conv2D)          (None, 14, 14, 256)  3539200     concatenate_2[0][0]              
__________________________________________________________________________________________________
Conv_6_1BatchNormalization (Bat (None, 14, 14, 256)  1024        Conv_6_1_conv[0][0]              
__________________________________________________________________________________________________
Conv_6_1ActivationLayer (Activa (None, 14, 14, 256)  0           Conv_6_1BatchNormalization[0][0] 
__________________________________________________________________________________________________
Conv_6_2_conv (Conv2D)          (None, 14, 14, 256)  590080      Conv_6_1ActivationLayer[0][0]    
__________________________________________________________________________________________________
Conv_6_2BatchNormalization (Bat (None, 14, 14, 256)  1024        Conv_6_2_conv[0][0]              
__________________________________________________________________________________________________
Conv_6_2ActivationLayer (Activa (None, 14, 14, 256)  0           Conv_6_2BatchNormalization[0][0] 
__________________________________________________________________________________________________
up_sampling2d_6 (UpSampling2D)  (None, 28, 28, 256)  0           Conv_6_2ActivationLayer[0][0]    
__________________________________________________________________________________________________
concatenate_3 (Concatenate)     (None, 28, 28, 512)  0           up_sampling2d_6[0][0]            
                                                                 conv_pw_5_relu[0][0]             
__________________________________________________________________________________________________
Conv_7_1_conv (Conv2D)          (None, 28, 28, 256)  1179904     concatenate_3[0][0]              
__________________________________________________________________________________________________
Conv_7_1BatchNormalization (Bat (None, 28, 28, 256)  1024        Conv_7_1_conv[0][0]              
__________________________________________________________________________________________________
Conv_7_1ActivationLayer (Activa (None, 28, 28, 256)  0           Conv_7_1BatchNormalization[0][0] 
__________________________________________________________________________________________________
Conv_7_2_conv (Conv2D)          (None, 28, 28, 256)  590080      Conv_7_1ActivationLayer[0][0]    
__________________________________________________________________________________________________
Conv_7_2BatchNormalization (Bat (None, 28, 28, 256)  1024        Conv_7_2_conv[0][0]              
__________________________________________________________________________________________________
Conv_7_2ActivationLayer (Activa (None, 28, 28, 256)  0           Conv_7_2BatchNormalization[0][0] 
__________________________________________________________________________________________________
up_sampling2d_7 (UpSampling2D)  (None, 56, 56, 256)  0           Conv_7_2ActivationLayer[0][0]    
__________________________________________________________________________________________________
concatenate_4 (Concatenate)     (None, 56, 56, 384)  0           up_sampling2d_7[0][0]            
                                                                 conv_pw_3_relu[0][0]             
__________________________________________________________________________________________________
Conv_8_1_conv (Conv2D)          (None, 56, 56, 192)  663744      concatenate_4[0][0]              
__________________________________________________________________________________________________
Conv_8_1BatchNormalization (Bat (None, 56, 56, 192)  768         Conv_8_1_conv[0][0]              
__________________________________________________________________________________________________
Conv_8_1ActivationLayer (Activa (None, 56, 56, 192)  0           Conv_8_1BatchNormalization[0][0] 
__________________________________________________________________________________________________
Conv_8_2_conv (Conv2D)          (None, 56, 56, 128)  221312      Conv_8_1ActivationLayer[0][0]    
__________________________________________________________________________________________________
Conv_8_2BatchNormalization (Bat (None, 56, 56, 128)  512         Conv_8_2_conv[0][0]              
__________________________________________________________________________________________________
Conv_8_2ActivationLayer (Activa (None, 56, 56, 128)  0           Conv_8_2BatchNormalization[0][0] 
__________________________________________________________________________________________________
up_sampling2d_8 (UpSampling2D)  (None, 112, 112, 128 0           Conv_8_2ActivationLayer[0][0]    
__________________________________________________________________________________________________
concatenate_5 (Concatenate)     (None, 112, 112, 192 0           up_sampling2d_8[0][0]            
                                                                 conv_pw_1_relu[0][0]             
__________________________________________________________________________________________________
Conv_9_1_conv (Conv2D)          (None, 112, 112, 96) 165984      concatenate_5[0][0]              
__________________________________________________________________________________________________
Conv_9_1BatchNormalization (Bat (None, 112, 112, 96) 384         Conv_9_1_conv[0][0]              
__________________________________________________________________________________________________
Conv_9_1ActivationLayer (Activa (None, 112, 112, 96) 0           Conv_9_1BatchNormalization[0][0] 
__________________________________________________________________________________________________
Conv_9_2_conv (Conv2D)          (None, 112, 112, 64) 55360       Conv_9_1ActivationLayer[0][0]    
__________________________________________________________________________________________________
Conv_9_2BatchNormalization (Bat (None, 112, 112, 64) 256         Conv_9_2_conv[0][0]              
__________________________________________________________________________________________________
Conv_9_2ActivationLayer (Activa (None, 112, 112, 64) 0           Conv_9_2BatchNormalization[0][0] 
__________________________________________________________________________________________________
up_sampling2d_9 (UpSampling2D)  (None, 224, 224, 64) 0           Conv_9_2ActivationLayer[0][0]    
__________________________________________________________________________________________________
concatenate_6 (Concatenate)     (None, 224, 224, 67) 0           up_sampling2d_9[0][0]            
                                                                 input_2[0][0]                    
__________________________________________________________________________________________________
Conv_10_1_conv (Conv2D)         (None, 224, 224, 48) 28992       concatenate_6[0][0]              
__________________________________________________________________________________________________
Conv_10_1BatchNormalization (Ba (None, 224, 224, 48) 192         Conv_10_1_conv[0][0]             
__________________________________________________________________________________________________
Conv_10_1ActivationLayer (Activ (None, 224, 224, 48) 0           Conv_10_1BatchNormalization[0][0]
__________________________________________________________________________________________________
Conv_10_2_conv (Conv2D)         (None, 224, 224, 32) 13856       Conv_10_1ActivationLayer[0][0]   
__________________________________________________________________________________________________
Conv_10_2BatchNormalization (Ba (None, 224, 224, 32) 128         Conv_10_2_conv[0][0]             
__________________________________________________________________________________________________
Conv_10_2ActivationLayer (Activ (None, 224, 224, 32) 0           Conv_10_2BatchNormalization[0][0]
__________________________________________________________________________________________________
spatial_dropout2d (SpatialDropo (None, 224, 224, 32) 0           Conv_10_2ActivationLayer[0][0]   
__________________________________________________________________________________________________
conv2d_1 (Conv2D)               (None, 224, 224, 1)  33          spatial_dropout2d[0][0]          
__________________________________________________________________________________________________
reshape_1 (Reshape)             (None, 224, 224)     0           conv2d_1[0][0]                   
==================================================================================================
Total params: 10,283,745
Trainable params: 7,051,713
Non-trainable params: 3,232,032
__________________________________________________________________________________________________
In [95]:
model.compile(loss = loss, optimizer = optimizer, metrics = [dice_coefficient])
model.fit(X_train, y_train, 
           epochs = 30, batch_size = 32, callbacks = [checkpoint, reduce_lr, stop], validation_data = (X_test, y_test))
Train on 310 samples, validate on 78 samples
Epoch 1/30
288/310 [==========================>...] - ETA: 20s - loss: 3.0713 - dice_coefficient: 0.0942
Epoch 00001: val_loss improved from 9.22994 to 5.92493, saving model to model-5.92.h5
310/310 [==============================] - 308s 995ms/sample - loss: 3.0650 - dice_coefficient: 0.0946 - val_loss: 5.9249 - val_dice_coefficient: 0.0040
Epoch 2/30
288/310 [==========================>...] - ETA: 19s - loss: 2.5942 - dice_coefficient: 0.1459
Epoch 00002: val_loss improved from 5.92493 to 3.20953, saving model to model-3.21.h5
310/310 [==============================] - 296s 956ms/sample - loss: 2.6646 - dice_coefficient: 0.1368 - val_loss: 3.2095 - val_dice_coefficient: 0.0546
Epoch 3/30
288/310 [==========================>...] - ETA: 19s - loss: 2.3640 - dice_coefficient: 0.1674
Epoch 00003: val_loss improved from 3.20953 to 3.02380, saving model to model-3.02.h5
310/310 [==============================] - 293s 947ms/sample - loss: 2.3466 - dice_coefficient: 0.1699 - val_loss: 3.0238 - val_dice_coefficient: 0.0639
Epoch 4/30
288/310 [==========================>...] - ETA: 19s - loss: 2.0736 - dice_coefficient: 0.1999
Epoch 00004: val_loss improved from 3.02380 to 3.00065, saving model to model-3.00.h5
310/310 [==============================] - 295s 951ms/sample - loss: 2.1059 - dice_coefficient: 0.1923 - val_loss: 3.0007 - val_dice_coefficient: 0.0736
Epoch 5/30
288/310 [==========================>...] - ETA: 19s - loss: 1.8241 - dice_coefficient: 0.2286
Epoch 00005: val_loss did not improve from 3.00065
310/310 [==============================] - 294s 949ms/sample - loss: 1.8366 - dice_coefficient: 0.2250 - val_loss: 3.4052 - val_dice_coefficient: 0.0513
Epoch 6/30
288/310 [==========================>...] - ETA: 19s - loss: 1.6653 - dice_coefficient: 0.2519
Epoch 00006: val_loss did not improve from 3.00065
310/310 [==============================] - 293s 945ms/sample - loss: 1.6459 - dice_coefficient: 0.2580 - val_loss: 3.2087 - val_dice_coefficient: 0.0648
Epoch 7/30
288/310 [==========================>...] - ETA: 19s - loss: 1.5836 - dice_coefficient: 0.2690
Epoch 00007: val_loss did not improve from 3.00065

Epoch 00007: ReduceLROnPlateau reducing learning rate to 3.999999898951501e-06.
310/310 [==============================] - 290s 935ms/sample - loss: 1.5612 - dice_coefficient: 0.2764 - val_loss: 3.4056 - val_dice_coefficient: 0.0596
Epoch 8/30
288/310 [==========================>...] - ETA: 19s - loss: 1.4728 - dice_coefficient: 0.2883
Epoch 00008: val_loss did not improve from 3.00065
310/310 [==============================] - 290s 935ms/sample - loss: 1.4748 - dice_coefficient: 0.2873 - val_loss: 3.2789 - val_dice_coefficient: 0.0654
Epoch 9/30
288/310 [==========================>...] - ETA: 19s - loss: 1.4524 - dice_coefficient: 0.2926
Epoch 00009: val_loss did not improve from 3.00065
310/310 [==============================] - 292s 943ms/sample - loss: 1.4563 - dice_coefficient: 0.2910 - val_loss: 3.1390 - val_dice_coefficient: 0.0734
Out[95]:
<tensorflow.python.keras.callbacks.History at 0x7f9d30718710>
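The `dice_coefficient` metric (and the Dice-based loss) passed to `model.compile` above is defined earlier in the notebook and is not shown in this excerpt. A minimal NumPy sketch of the standard formulation, with an assumed `smooth` term to avoid division by zero on empty masks, is:

```python
import numpy as np

def dice_coefficient(y_true, y_pred, smooth=1.0):
    # Dice = 2*|A ∩ B| / (|A| + |B|); smooth keeps the ratio defined
    # when both masks are empty
    y_true_f = y_true.flatten()
    y_pred_f = y_pred.flatten()
    intersection = np.sum(y_true_f * y_pred_f)
    return (2.0 * intersection + smooth) / (np.sum(y_true_f) + np.sum(y_pred_f) + smooth)

# Identical masks give a Dice score of 1.0
mask = np.ones((4, 4))
print(dice_coefficient(mask, mask))  # 1.0
```

The in-notebook version operates on Keras tensors rather than NumPy arrays, but the arithmetic is the same.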
In [96]:
scores = model.evaluate(X_test, y_test, verbose = 1)
78/1 [==============================] - 20s 256ms/sample - loss: 3.0713 - dice_coefficient: 0.0734
In [97]:
print("Dice coefficient: ", scores[1])
print("Loss: ", scores[0])
Dice coefficient:  0.07336023
Loss:  3.139011859893799
In [98]:
y_pred = model.predict(X_test, verbose = 1)
78/1 [==============================] - 21s 268ms/sample
In [99]:
viewImage(3, X_test, y_pred)
Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).

Model 2: Predict Pneumonia

In [100]:
path_class_reduced_target = train_class_df.sample(frac = 0.02) 
In [101]:
path_class_reduced_target = downSample(path_class_reduced_target)
1    183
0    183
Name: Target, dtype: int64
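`downSample` is defined earlier in the notebook; the balanced 183/183 counts above come from shrinking the majority class to the minority-class size. A hypothetical re-implementation of that idea (function name and `random_state` are assumptions, not the notebook's actual code):

```python
import pandas as pd

def down_sample(df, target_col='Target', random_state=42):
    # Sample every class down to the size of the smallest class,
    # so the Target distribution ends up balanced
    n_min = df[target_col].value_counts().min()
    balanced = (df.groupby(target_col, group_keys=False)
                  .apply(lambda g: g.sample(n=n_min, random_state=random_state)))
    return balanced.reset_index(drop=True)

demo = pd.DataFrame({'Target': [0] * 10 + [1] * 3})
print(down_sample(demo)['Target'].value_counts())
```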
In [102]:
path_class_reduced_target.head()
Out[102]:
index patientId x1 y1 width height Target class path PatientSex PatientAge ViewPosition x2 y2
0 21219 c0767025-7482-446c-a5cb-eecee44b226c 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/c0767025-7482-446c-a5cb-e... M 22.0 AP 0.0 0.0
1 2671 336b379f-b679-42cb-9e9b-e2483a4cae2c 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/336b379f-b679-42cb-9e9b-e... F 38.0 PA 0.0 0.0
2 18067 ab6db814-4920-4095-8d8b-3d5f2450c156 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/ab6db814-4920-4095-8d8b-3... M 54.0 PA 0.0 0.0
3 10656 70f14ca7-41a0-445f-9274-fe0a7b8890be 0.0 0.0 0.0 0.0 0 No Lung Opacity / Not Normal stage_2_train_images/70f14ca7-41a0-445f-9274-f... M 48.0 AP 0.0 0.0
4 6992 535c2337-e755-43ea-ab1d-6547a584776d 0.0 0.0 0.0 0.0 0 Normal stage_2_train_images/535c2337-e755-43ea-ab1d-6... F 38.0 PA 0.0 0.0
In [103]:
# A dataframe with paths, classes and targets
print('Prepare a dataframe with paths, classes and targets'); print('--'*40)
path_class_target = path_class_reduced_target[['patientId', 'path', 'class', 'Target']].copy(deep = True)
path_class_target['path'] = (path_class_target['path']
                             .str.replace('stage_2_train_images', 'JPG_train')
                             .str.replace('.dcm', '.jpg', regex = False))  # '.' is a regex wildcard by default
path_class_target.drop_duplicates(inplace = True)
path_class_reduced_df = path_class_target.reset_index()
display(path_class_target.shape, path_class_target.nunique())
print('\nDistribution of target and classes')
display(path_class_reduced_df['Target'].value_counts())
print()
display(path_class_reduced_df['class'].value_counts())
Prepare a dataframe with paths, classes and targets
--------------------------------------------------------------------------------
(364, 4)
patientId    364
path         364
class          3
Target         2
dtype: int64
Distribution of target and classes
0    183
1    181
Name: Target, dtype: int64

Lung Opacity                    181
No Lung Opacity / Not Normal    114
Normal                           69
Name: class, dtype: int64
In [104]:
#path_class_target.head(10)
In [105]:
#path_class_reduced_df = path_class_target.sample(frac = 0.02) 
In [106]:
#path_class_reduced_df.shape
In [107]:
# Split the data in train, valid and test sets
print('Split the data in train, valid and test sets'); print('--'*40)

image_list = list(path_class_reduced_df['path'])
random.shuffle(image_list)
test_size = round(len(image_list)/10)
val_size = round(len(image_list)/10)
train_size = len(image_list)-test_size-val_size

X_train = image_list[:train_size]
X_valid = image_list[train_size:(train_size + val_size)]
X_test = image_list[(train_size + val_size):]
Split the data in train, valid and test sets
--------------------------------------------------------------------------------
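The cell above carves the shuffled path list into an ~80/10/10 split using `round(len/10)`. As a standalone sanity check on that size arithmetic (`split_sizes` is a hypothetical helper, not part of the notebook):

```python
def split_sizes(n, frac=0.1):
    """Compute train/valid/test sizes for an (1 - 2*frac)/frac/frac split,
    mirroring the round(len/10) arithmetic used above."""
    test = round(n * frac)
    valid = round(n * frac)
    train = n - test - valid  # remainder goes to training
    return train, valid, test

# For the 364 images in this reduced dataset:
print(split_sizes(364))  # (292, 36, 36)
```

Because the train size is computed as a remainder, the three sizes always sum back to the original count.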
In [108]:
print('Create Training, Validation and Test Dataframe with Path and Target'); print('--'*40)

df_train = (path_class_reduced_df.merge(pd.Series(X_train, name = 'path'), 
                                    on = 'path', 
                                    how = 'right')
          .drop(['class'], axis = 1))

df_valid = (path_class_reduced_df.merge(pd.Series(X_valid, name = 'path'), 
                                    on = 'path', 
                                    how = 'right')
          .drop(['class'], axis = 1))

df_test = (path_class_reduced_df.merge(pd.Series(X_test, name = 'path'), 
                                    on = 'path', 
                                    how = 'right')
          .drop(['class'], axis = 1))

print('Shape of the dataframes:\nTrain:{}\nValid:{}\nTest:{}'.format(df_train.shape, df_valid.shape, df_test.shape))
Create Training, Validation and Test Dataframe with Path and Target
--------------------------------------------------------------------------------
Shape of the dataframes:
Train:(292, 4)
Valid:(36, 4)
Test:(36, 4)
In [109]:
df_train.shape
Out[109]:
(292, 4)
In [110]:
df_train.head()
Out[110]:
index patientId path Target
0 333 8aaeb451-26b5-4192-9a82-1d7eded2e363 JPG_train/8aaeb451-26b5-4192-9a82-1d7eded2e363... 1
1 336 0c77d941-5f63-4e2a-9c1e-4a58043acd55 JPG_train/0c77d941-5f63-4e2a-9c1e-4a58043acd55... 1
2 58 caa8b9fe-ab6b-49d6-9897-d46833013c6b JPG_train/caa8b9fe-ab6b-49d6-9897-d46833013c6b... 0
3 323 a8521ca7-b65f-42e9-83e2-f3dddf811eb3 JPG_train/a8521ca7-b65f-42e9-83e2-f3dddf811eb3... 1
4 267 c5a8e91a-2d40-4355-91af-73abf5f0fdf1 JPG_train/c5a8e91a-2d40-4355-91af-73abf5f0fdf1... 1
In [111]:
df_train.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 292 entries, 0 to 291
Data columns (total 4 columns):
 #   Column     Non-Null Count  Dtype 
---  ------     --------------  ----- 
 0   index      292 non-null    int64 
 1   patientId  292 non-null    object
 2   path       292 non-null    object
 3   Target     292 non-null    int64 
dtypes: int64(2), object(2)
memory usage: 11.4+ KB
In [112]:
print('Training, Validation and Test set is ~equally distributed on target'); print('--'*40)
print('Distribution of target in the training set:'); 
display(pd.Series(df_train['Target']).value_counts(normalize = True).round(2))
print('\nDistribution of target in the validation set:'); 
display(pd.Series(df_valid['Target']).value_counts(normalize = True).round(2))
print('\nDistribution of target in the test set:'); 
display(pd.Series(df_test['Target']).value_counts(normalize = True).round(2))
Training, Validation and Test set is ~equally distributed on target
--------------------------------------------------------------------------------
Distribution of target in the training set:
1    0.51
0    0.49
Name: Target, dtype: float64
Distribution of target in the validation set:
0    0.56
1    0.44
Name: Target, dtype: float64
Distribution of target in the test set:
0    0.56
1    0.44
Name: Target, dtype: float64
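The printed ratios confirm the three sets are roughly balanced on the target. The same normalised-count arithmetic as `value_counts(normalize=True)`, sketched with the standard library (`class_balance` and the label counts below are hypothetical):

```python
from collections import Counter

def class_balance(labels):
    """Fraction of each class, rounded to 2 decimals --
    the same arithmetic as value_counts(normalize=True).round(2)."""
    counts = Counter(labels)
    n = len(labels)
    return {k: round(v / n, 2) for k, v in counts.items()}

# Hypothetical counts matching the 0.51/0.49 training-set ratio above:
print(class_balance([1] * 148 + [0] * 144))  # {1: 0.51, 0: 0.49}
```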
In [113]:
print('Save the train, valid and test dataframes for future use');print('--'*40)
df_train.to_pickle('train_data.pkl')
df_valid.to_pickle('valid_data.pkl')
df_test.to_pickle('test_data.pkl')
Save the train, valid and test dataframes for future use
--------------------------------------------------------------------------------
In [114]:
# Data generator
class DataGenerators:
    def __init__(self, df_train, df_valid, df_test, batch_size, path,
                 img_size = (224, 224), class_mode = 'binary',
                 random_state = 2020):
        self.df_train = df_train
        self.df_valid = df_valid
        self.df_test = df_test
        self.batch_size = batch_size
        self.img_size = img_size
        self.path = path
        self.class_mode = class_mode
        
        # Caution: preprocess_input already normalises DenseNet inputs, so
        # adding rescale = 1/255. scales the pixel values a second time.
        train_augmenter = ImageDataGenerator(
            preprocessing_function = preprocess_input,
            rotation_range = 20, width_shift_range = 0.2,
            height_shift_range = 0.2, zoom_range = 0.2,
            horizontal_flip = True, rescale = 1/255.
            )
        
        valid_augmenter = ImageDataGenerator(
            preprocessing_function = preprocess_input, 
            rescale = 1/255.
            )
        
        test_augmenter = ImageDataGenerator(
            preprocessing_function = preprocess_input,
            rescale = 1/255.
            )
        
        print('Train Generator Created', '--'*20)
        self.train_generator = train_augmenter.flow_from_dataframe(
            x_col = 'path',
            y_col = 'Target',
            dataframe = self.df_train,
            batch_size = self.batch_size,
            target_size = self.img_size,
            directory = self.path,
            class_mode = self.class_mode,
            seed = random_state,
            shuffle = True
            )
        print('Validation Generator Created', '--'*20)
        self.valid_generator = valid_augmenter.flow_from_dataframe(
            x_col = 'path',
            y_col = 'Target',
            dataframe = self.df_valid,
            batch_size = self.batch_size,
            target_size = self.img_size,
            directory = self.path,
            class_mode = self.class_mode,
            seed = random_state,
            shuffle = False
            )
        print('Test Generator Created', '--'*20)
        self.test_generator = test_augmenter.flow_from_dataframe(
            x_col = 'path',
            y_col = 'Target',
            dataframe = self.df_test,
            batch_size = self.batch_size,
            target_size = self.img_size,
            directory = self.path,
            class_mode = self.class_mode,
            seed = random_state,
            shuffle = False
            )
        
        # Steps per epoch/evaluation: enough batches to cover every sample once
        self.step_size_train = math.ceil(
            self.train_generator.n / self.train_generator.batch_size
            )
        self.step_size_valid = math.ceil(
            self.valid_generator.n / self.valid_generator.batch_size
            )
        self.step_size_test = math.ceil(
            self.test_generator.n / self.test_generator.batch_size
            )
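A note on the step-size arithmetic: `n // batch_size + 1` over-counts by one step whenever `n` is an exact multiple of `batch_size`, while `math.ceil(n / batch_size)` always yields exactly the number of batches needed. A minimal sketch (`steps_per_epoch` is a hypothetical helper):

```python
import math

def steps_per_epoch(n_samples, batch_size):
    # Number of batches needed to cover all samples once;
    # the last, possibly partial, batch is counted via ceil.
    return math.ceil(n_samples / batch_size)

print(steps_per_epoch(292, 32))  # 10 (9 full batches + 1 partial batch of 4)
print(steps_per_epoch(36, 32))   # 2
print(steps_per_epoch(64, 32))   # 2 (n // batch_size + 1 would give 3 here)
```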
In [115]:
# flow_from_dataframe with class_mode = 'binary' expects string labels
df_train['Target'] = df_train['Target'].astype(str);
df_valid['Target'] = df_valid['Target'].astype(str);
df_test['Target'] = df_test['Target'].astype(str)
In [116]:
df_train.head(3)
Out[116]:
index patientId path Target
0 333 8aaeb451-26b5-4192-9a82-1d7eded2e363 JPG_train/8aaeb451-26b5-4192-9a82-1d7eded2e363... 1
1 336 0c77d941-5f63-4e2a-9c1e-4a58043acd55 JPG_train/0c77d941-5f63-4e2a-9c1e-4a58043acd55... 1
2 58 caa8b9fe-ab6b-49d6-9897-d46833013c6b JPG_train/caa8b9fe-ab6b-49d6-9897-d46833013c6b... 0
In [117]:
TRAIN_IMAGES_DIR = os.path.join('/Volumes/Ayon_Drive/GreatLearning/Capstone_Pneumonia/')
In [118]:
print('Create generators for training, validation and test dataframes'); print('--'*40)
generators = DataGenerators(df_train, df_valid, df_test, 
                            batch_size = 32, 
                            path = TRAIN_IMAGES_DIR, 
                            img_size = (224, 224), 
                            class_mode = 'binary',
                            random_state = 2020)
Create generators for training, validation and test dataframes
--------------------------------------------------------------------------------
Train Generator Created ----------------------------------------
Found 292 validated image filenames belonging to 2 classes.
Validation Generator Created ----------------------------------------
Found 36 validated image filenames belonging to 2 classes.
Test Generator Created ----------------------------------------
Found 36 validated image filenames belonging to 2 classes.
In [119]:
# ROC AUC as a Metric
# Reference: https://stackoverflow.com/questions/41032551/how-to-compute-receiving-operating-characteristic-roc-and-auc-in-keras
def roc_auc(y_true, y_pred):
    return tf.compat.v1.py_function(roc_auc_score, (y_true, y_pred), tf.double)

# Precision as a Metric (batch-wise). Note: despite the function name, this computes precision, not average precision.
import tensorflow.keras.backend as K
def average_precision(y_true, y_pred):
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
    precision = true_positives / (predicted_positives + K.epsilon())
    return precision

# F1 score as a Metric
# Reference: https://stackoverflow.com/questions/43547402/how-to-calculate-f1-macro-in-keras
def f1_score(y_true, y_pred):
    def recall(y_true, y_pred):
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
        recall = true_positives / (possible_positives + K.epsilon())
        return recall

    def precision(y_true, y_pred):
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
        precision = true_positives / (predicted_positives + K.epsilon())
        return precision
    precision = precision(y_true, y_pred)
    recall = recall(y_true, y_pred)
    return 2*((precision*recall)/(precision+recall+K.epsilon()))
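These Keras-backend metrics are computed batch-wise, so they only approximate the dataset-level scores. A plain-NumPy analogue of the same precision/recall/F1 arithmetic (hypothetical helper and example batch; it thresholds at 0.5 rather than using `K.round`):

```python
import numpy as np

def precision_recall_f1(y_true, y_prob, threshold=0.5):
    """NumPy version of the batch-wise metrics above: probabilities are
    binarised at `threshold` and the usual counts are compared."""
    y_true = np.asarray(y_true)
    y_pred = (np.asarray(y_prob) >= threshold).astype(int)
    tp = int(np.sum((y_true == 1) & (y_pred == 1)))
    precision = tp / max(int(np.sum(y_pred == 1)), 1)
    recall = tp / max(int(np.sum(y_true == 1)), 1)
    f1 = 2 * precision * recall / max(precision + recall, 1e-7)
    return precision, recall, f1

# Hypothetical batch: 3 true positives, 1 false positive, 1 false negative
p, r, f1 = precision_recall_f1([1, 1, 1, 1, 0], [0.9, 0.8, 0.7, 0.2, 0.6])
print(p, r, round(f1, 3))  # 0.75 0.75 0.75
```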
In [120]:
# Model Parameters
BATCH_SIZE = 32
IMAGE_SIZE = 224
EPOCH = 10
LEARNING_RATE = 1e-4
MONITOR = 'val_loss'
MODE = 'min'
VERBOSE = 1
FACTOR = 0.1
PATIENCE = 5
COOLDOWN = 5
BEST_MODEL = 'best_densenet.h5'
FINAL_MODEL = 'best_densenet_final.h5'
LOG_FILE = 'logs.csv'
LOSS = 'binary_crossentropy'
METRICS = ['accuracy', average_precision, f1_score]
In [131]:
def buildModel(MODEL):
    #print('Create a' + str(MODEL) + ' model'); print('--'*40)
    input_shape = (IMAGE_SIZE, IMAGE_SIZE, 3)
    inputs = Input(shape = input_shape)
    initializer = tf.keras.initializers.GlorotNormal()
    
    base_model = MODEL(include_top = False, input_tensor = inputs, weights ='imagenet')
    
    for layer in base_model.layers[:-12]:
        layer.trainable = False
    
    for layer in base_model.layers:
        print(layer,layer.trainable)
    
    model = Sequential(name = 'DenseNet121')
    model.add(base_model)
    model.add(GlobalAveragePooling2D())
    model.add(Dropout(0.4))
    model.add(Dense(1, activation = 'sigmoid', kernel_initializer = initializer))
    model.summary()
    return model

def callback_model():
    print("in call backs")
    lrscheduler = ReduceLROnPlateau(monitor = MONITOR, factor = FACTOR, 
                                    patience = PATIENCE, verbose = VERBOSE, 
                                    mode = MODE, cooldown = COOLDOWN)
    
    model.compile(optimizer = Adam(learning_rate = LEARNING_RATE), loss = LOSS, metrics = METRICS)
    
    cp = ModelCheckpoint(filepath = MODEL_WEIGHTS + BEST_MODEL, monitor = MONITOR, 
                         verbose = VERBOSE, save_best_only = True, mode = MODE)
    
    if os.path.exists(MODEL_WEIGHTS + LOG_FILE): os.remove(MODEL_WEIGHTS + LOG_FILE)
    csv_logger = CSVLogger(MODEL_WEIGHTS + LOG_FILE, append = True)
    
    callbacks = [cp, csv_logger, lrscheduler]
    return callbacks

def evaluateValidationData(model):
    
    ##Evaluate on validation data
    print('Evaluate the model on validation data'); print('--'*40)

    loss, accuracy, ap, f1 = model.evaluate_generator(generator = validation_generator, 
                                          steps = generators.step_size_valid)
    print(f'Loss: {round(loss, 3)}, Accuracy: {round(float(accuracy), 3)},  AP: {round(float(ap), 3)}, F1 Score: {round(float(f1), 3)}')
    
    ##Prediction on validation data
    print('Predict on the validation data'); print('--'*40)
    validation_generator.reset()
    valid_pred_roc = model.predict_generator(generator = validation_generator,
                                             steps = generators.step_size_valid,
                                             verbose = 1)
    valid_pred = []
    for i in valid_pred_roc:
        if i >= 0.5: valid_pred.append(1)
        else: valid_pred.append(0)
    y_valid = df_valid['Target'].astype(int).values
    x_valid = df_valid['path']
    
    return valid_pred, y_valid, x_valid, valid_pred_roc

def evaluateTestData(y_valid):
    
    ##Prediction on test data
    print('Predict on the test data'); print('--'*40)
    test_generator.reset()
    test_pred_roc = model.predict_generator(generator = test_generator,
                                            steps = generators.step_size_test,
                                            verbose = 1)
    test_pred = []
    for i in test_pred_roc:
        if i >= 0.5: test_pred.append(1)
        else: test_pred.append(0)
    y_test = df_test['Target'].astype(int).values
    x_test = df_test['path']
          
    display(pd.Series(y_valid).value_counts(), pd.Series(y_test).value_counts())
    
    correct = np.nonzero(test_pred == y_test)[0]
    incorrect = np.nonzero(test_pred != y_test)[0]
    percentage = ((correct.size)/(correct.size + incorrect.size)) * 100
    
    print("Correctly predicted %d images out of %d images" %(correct.size, correct.size+incorrect.size))
    print("Predicted %.0f%% test images correctly" %(correct.size/(correct.size+incorrect.size)*100))
    
    return test_pred, y_test, x_test, correct, incorrect, test_pred_roc

def evaluateROC(valid_pred_roc, test_pred_roc, y_valid, y_test):
    # Note: the classification reports below also read valid_pred and
    # test_pred from the notebook's global scope.
    
    print('ROC Curve for the validation data'); print('--'*40)

    roc_auc_valid = roc_auc_score(y_valid, np.array(valid_pred_roc).reshape(-1))
    print('AUC: {:0.3f}'.format(roc_auc_valid))

    fig = plt.figure(figsize = (10, 7.2))
    fpr, tpr, thresholds = roc_curve(y_valid, np.array(valid_pred_roc).reshape(-1))
    plt.title('ROC Curve for the validation data')
    plt.ylabel('True Positive Rate')
    plt.xlabel('False Positive Rate')
    plt.axis([0, 1, 0, 1])
    plt.plot([0, 1], [0, 1], linestyle = '--', label = 'No Skill')
    plt.plot(fpr, tpr, marker = '.', label = 'ROC curve (area = %0.3f)' % roc_auc_valid)
    plt.legend(loc = 'lower right')
    plt.show()
          
    print('ROC Curve for the test data'); print('--'*40)

    roc_auc_test = roc_auc_score(y_test, np.array(test_pred_roc).reshape(-1))
    print('AUC: {:0.3f}'.format(roc_auc_test))

    fig = plt.figure(figsize = (10, 7.2))
    fpr, tpr, thresholds = roc_curve(y_test, np.array(test_pred_roc).reshape(-1))
    plt.title('ROC Curve for the test data')
    plt.ylabel('True Positive Rate')
    plt.xlabel('False Positive Rate')
    plt.axis([0, 1, 0, 1])
    plt.plot([0, 1], [0, 1], linestyle = '--', label = 'Random')
    plt.plot(fpr, tpr, marker = '.', label = 'ROC curve (area = %0.3f)' % roc_auc_test)
    plt.legend(loc = 'lower right')
    plt.show()
          
    print('Classification Report on the test data'); print('--'*60)
    print(classification_report(y_test, test_pred, target_names = ['Normal', 'Pneumonia']))
          
    print('Classification Report on the validation data'); print('--'*60)
    print(classification_report(y_valid, valid_pred, target_names = ['Normal', 'Pneumonia']))
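`evaluateValidationData` and `evaluateTestData` binarise the predicted probabilities with a Python loop; the same thresholding can be done in one vectorised NumPy expression (hypothetical probabilities shown):

```python
import numpy as np

# model.predict returns probabilities as an (n, 1) array;
# the append-loop above is equivalent to one vectorised comparison.
pred_probs = np.array([[0.91], [0.07], [0.55], [0.49]])  # hypothetical outputs
pred_labels = (pred_probs.reshape(-1) >= 0.5).astype(int)
print(pred_labels.tolist())  # [1, 0, 1, 0]
```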
In [122]:
MODEL_WEIGHTS = os.path.join('model_weights/')
if not os.path.exists(MODEL_WEIGHTS): os.makedirs(MODEL_WEIGHTS)
In [123]:
print("Let's fit the model.....")
K.clear_session()
model = buildModel(DenseNet121)
callbacks = callback_model()
train_generator = generators.train_generator
validation_generator = generators.valid_generator
test_generator = generators.test_generator    
history = model.fit_generator(generator = train_generator, 
                              steps_per_epoch = generators.step_size_train,
                              epochs = EPOCH, verbose = VERBOSE, 
                              callbacks = callbacks,
                              validation_data = validation_generator, 
                              validation_steps = generators.step_size_valid)
print('Save the final weights'); print('--'*40)
model.save(MODEL_WEIGHTS + FINAL_MODEL)
Let's fit the model.....
[output truncated: the loop prints each DenseNet121 base-model layer with its trainable flag; the frozen layers (all but the last 12) show False]
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6c89ff28> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6c8bcf28> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6c8cf828> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6c8cf7f0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6c953b00> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6c953940> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6c95fc50> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6c95fbe0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6ca28be0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6ca3b4e0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6ca3b4a8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6cc2f7b8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6cc2f5f8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6cc38908> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6cc38898> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6ccbd898> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6cccc198> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6cccc160> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6cd9c470> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6cd9c2b0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6cda65c0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6cda6550> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6cf63550> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6cf67048> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6d00edd8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6d028f98> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6d03d0f0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6d051278> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6d051208> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6d0bdf28> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6d202ac8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6d202a90> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6d2d0da0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6d2d0cc0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6d2e3ef0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6d2e3e80> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6d363e80> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6d375780> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6d375748> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6d44ba58> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6d44b898> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6d45bba8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6d45bb38> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6d4dcb38> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6d4ed438> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6d4ed400> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6d5cb710> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6d5cb550> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6d5d0860> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6d5d07f0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6d67e7f0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6d688208> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6d6881d0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6d8873c8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6d887208> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6d892518> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6d8924a8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6d93b4a8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6d9a8198> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6d9facc0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6da16ef0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6da16f98> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6da2bfd0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6dc25e10> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6dc41fd0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6dc54a20> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6dc549e8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6dd39cf8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6dd39c18> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6dd4ce48> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6dd4cdd8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6ddccdd8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6dddc6d8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6dddc6a0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6dec89b0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6dec87f0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6ded8b00> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6ded8a90> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6df79a90> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6df8e390> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6df8e358> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6e187668> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6e1874a8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6e18c7b8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6e18c748> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6e23c748> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6e23cf98> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6e2450f0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6e341320> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6e341160> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6e375470> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6e375400> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6e50b400> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6e5180f0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6e5dff98> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6e703f98> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6e703ef0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6e718f98> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6e79fcf8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6e7b8f28> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6e7c7978> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6e7c7940> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6e8c4c50> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6e8c4b70> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6e8d7da0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6e8d7d30> False
<tensorflow.python.keras.layers.pooling.AveragePooling2D object at 0x7f9b6e97fd30> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6e97fe10> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6e990ef0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6e990e80> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6ed47e80> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6ed57780> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6ed57748> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6edd9a58> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6edd9898> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6ede7ba8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6ede7b38> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6eea9b38> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6eeb8438> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6eeb8400> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6ef40710> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6ef40550> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6ef93860> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6ef937f0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6f03a7f0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6f044208> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6f0441d0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6f21e3c8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6f21e208> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6f229518> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6f2294a8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6f2d64a8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6f2dc198> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6f37dcc0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6f396ef0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6f396f98> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6f3adfd0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6f435e10> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6f44cfd0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6f464a20> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6f4649e8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6f531cf8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6f531c18> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6f543e48> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6f543dd8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6f71ddd8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6f7306d8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6f7306a0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6f8019b0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6f8017f0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6f810b00> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6f810a90> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6f893a90> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6f8a3390> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6f8a3358> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6faa7668> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6faa74a8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6faaf7b8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6faaf748> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6fb3a748> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6fb3af98> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6fb410f0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6fc1f320> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6fc1f160> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6fc5b470> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6fc5b400> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6fe05400> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6fe110f0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6fec1f98> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6feddf98> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6feddef0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6fef1f98> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6ff53cf8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6ff6ef28> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6ff83978> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6ff83940> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b70183c50> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b70183b70> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b70195da0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b70195d30> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b7023bd30> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b7024c630> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b7024c5f8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b70330908> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b70330748> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b7033fa58> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b7033f9e8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b703e79e8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b703f82e8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b703f82b0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b704e45c0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b704e4400> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b704ea710> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b704ea6a0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b7058e6a0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b7058ef60> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b705a6390> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b707a0f98> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b707b7160> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b707c93c8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b707c9358> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b7097b358> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b7098a048> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b70a4def0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b70a6cef0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b70a6ce10> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b70a7eef0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b70a7ef60> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b70b1efd0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b70b358d0> True
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b70b35898> True
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b70c2bba8> True
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b70c2bac8> True
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b70c39cf8> True
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b70c39c88> True
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b70e07c88> True
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b70e18588> True
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b70e18550> True
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b71814860> True
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b718146a0> True
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b718249e8> True
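The `True`/`False` flags above are typically produced by a loop that freezes the base model up to a chosen cut-off and prints each layer with its `trainable` attribute. A minimal sketch of that pattern, using a stand-in `Layer` class and a hypothetical cut-off index (the notebook's actual boundary sits inside DenseNet121's final dense block):

```python
class Layer:
    """Stand-in for a Keras layer; only the `trainable` flag matters here."""
    def __init__(self, name):
        self.name = name
        self.trainable = True

    def __repr__(self):
        return "<Layer {}>".format(self.name)

# Hypothetical cut-off: freeze everything before index 7.
layers = [Layer("layer_%d" % i) for i in range(10)]
fine_tune_from = 7

for i, layer in enumerate(layers):
    layer.trainable = i >= fine_tune_from
    print(layer, layer.trainable)   # mirrors the "<...> False/True" lines above
```

With a real Keras model the loop body is the same, iterating over `model.layers` instead of the stand-in list.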
Model: "DenseNet121"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet121 (Model)          (None, 7, 7, 1024)        7037504   
_________________________________________________________________
global_average_pooling2d (Gl (None, 1024)              0         
_________________________________________________________________
dropout (Dropout)            (None, 1024)              0         
_________________________________________________________________
dense (Dense)                (None, 1)                 1025      
=================================================================
Total params: 7,038,529
Trainable params: 206,017
Non-trainable params: 6,832,512
_________________________________________________________________
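The head's parameter count in the summary can be checked by hand: global average pooling collapses DenseNet121's (7, 7, 1024) feature map to a 1024-vector, and the single sigmoid unit needs one weight per feature plus a bias.

```python
features = 1024   # channels out of DenseNet121's final dense block
dense_units = 1   # single sigmoid output for binary classification

dense_params = features * dense_units + dense_units  # weights + bias
print(dense_params)  # 1025, matching the dense layer's param count in the summary
```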
in callbacks
Epoch 1/10
 9/10 [==========================>...] - ETA: 28s - loss: 0.9198 - accuracy: 0.4923 - average_precision: 0.4559 - f1_score: 0.4751
Epoch 00001: val_loss improved from inf to 0.71375, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 302s 30s/step - loss: 0.8911 - accuracy: 0.4966 - average_precision: 0.4633 - f1_score: 0.4821 - val_loss: 0.7137 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 2/10
 9/10 [==========================>...] - ETA: 26s - loss: 0.8192 - accuracy: 0.5154 - average_precision: 0.5826 - f1_score: 0.5387
Epoch 00002: val_loss improved from 0.71375 to 0.71141, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 289s 29s/step - loss: 0.8064 - accuracy: 0.5274 - average_precision: 0.5815 - f1_score: 0.5419 - val_loss: 0.7114 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 3/10
 9/10 [==========================>...] - ETA: 28s - loss: 0.8357 - accuracy: 0.5000 - average_precision: 0.4791 - f1_score: 0.4736
Epoch 00003: val_loss improved from 0.71141 to 0.70960, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 310s 31s/step - loss: 0.8406 - accuracy: 0.4932 - average_precision: 0.4680 - f1_score: 0.4700 - val_loss: 0.7096 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 4/10
 9/10 [==========================>...] - ETA: 27s - loss: 0.7574 - accuracy: 0.5231 - average_precision: 0.5272 - f1_score: 0.5345
Epoch 00004: val_loss improved from 0.70960 to 0.70890, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 298s 30s/step - loss: 0.7422 - accuracy: 0.5377 - average_precision: 0.5307 - f1_score: 0.5431 - val_loss: 0.7089 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 5/10
 9/10 [==========================>...] - ETA: 27s - loss: 0.7005 - accuracy: 0.5538 - average_precision: 0.4800 - f1_score: 0.4796
Epoch 00005: val_loss did not improve from 0.70890
10/10 [==============================] - 294s 29s/step - loss: 0.7149 - accuracy: 0.5377 - average_precision: 0.4903 - f1_score: 0.4741 - val_loss: 0.7089 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 6/10
 9/10 [==========================>...] - ETA: 26s - loss: 0.7940 - accuracy: 0.5308 - average_precision: 0.5960 - f1_score: 0.5385
Epoch 00006: val_loss improved from 0.70890 to 0.70649, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 291s 29s/step - loss: 0.7869 - accuracy: 0.5411 - average_precision: 0.5926 - f1_score: 0.5447 - val_loss: 0.7065 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 7/10
 9/10 [==========================>...] - ETA: 26s - loss: 0.7647 - accuracy: 0.6038 - average_precision: 0.5500 - f1_score: 0.5558
Epoch 00007: val_loss improved from 0.70649 to 0.70305, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 284s 28s/step - loss: 0.7563 - accuracy: 0.5959 - average_precision: 0.5350 - f1_score: 0.5446 - val_loss: 0.7030 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 8/10
 9/10 [==========================>...] - ETA: 24s - loss: 0.7279 - accuracy: 0.5692 - average_precision: 0.5212 - f1_score: 0.5434
Epoch 00008: val_loss did not improve from 0.70305
10/10 [==============================] - 260s 26s/step - loss: 0.7260 - accuracy: 0.5753 - average_precision: 0.5327 - f1_score: 0.5429 - val_loss: 0.7035 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 9/10
 9/10 [==========================>...] - ETA: 24s - loss: 0.6506 - accuracy: 0.6462 - average_precision: 0.6204 - f1_score: 0.6140
Epoch 00009: val_loss improved from 0.70305 to 0.70255, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 265s 26s/step - loss: 0.6573 - accuracy: 0.6541 - average_precision: 0.6450 - f1_score: 0.6269 - val_loss: 0.7026 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 10/10
 9/10 [==========================>...] - ETA: 25s - loss: 0.6978 - accuracy: 0.6000 - average_precision: 0.5664 - f1_score: 0.5425
Epoch 00010: val_loss improved from 0.70255 to 0.70183, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 272s 27s/step - loss: 0.7003 - accuracy: 0.6027 - average_precision: 0.5631 - f1_score: 0.5454 - val_loss: 0.7018 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
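Throughout training the validation accuracy is pinned at 0.5556 while the validation AP and F1 score are zero: the signature of a model that predicts the negative (Normal) class for every validation image. With the 20 Normal / 16 Pneumonia split reported in the validation classification report, that degenerate behaviour reproduces the logged numbers exactly:

```python
normal, pneumonia = 20, 16   # class support in the validation report

# Accuracy of a degenerate classifier that always predicts "Normal"
majority_accuracy = normal / (normal + pneumonia)
print(round(majority_accuracy, 4))   # 0.5556 -- the val_accuracy at every epoch

# With zero positive predictions, F1 for the Pneumonia class is 0
tp, fp, fn = 0, 0, pneumonia
f1 = 0.0 if (2 * tp + fp + fn) == 0 else 2 * tp / (2 * tp + fp + fn)
print(f1)                            # 0.0 -- matching val_f1_score
```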
Save the final weights
--------------------------------------------------------------------------------
In [132]:
valid_pred, y_valid, x_valid, valid_pred_roc = evaluateValidationData(model)
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 0.702, Accuracy: 0.556,  AP: 0.0, F1 Score: 0.0
Predict on the validation data
--------------------------------------------------------------------------------
2/2 [==============================] - 14s 7s/step
In [133]:
test_pred, y_test, x_test, correct, incorrect, test_pred_roc = evaluateTestData(y_valid)
Predict on the test data
--------------------------------------------------------------------------------
2/2 [==============================] - 15s 8s/step
0    20
1    16
dtype: int64
0    20
1    16
dtype: int64
Correctly predicted 20 images out of 36 images
Predicted 56% of test images correctly
In [126]:
def viewPredictedImage(indices):
    # Show the first four test images at the given indices with their
    # predicted and actual classes.
    for c in indices[:4]:
        fig, ax1 = plt.subplots(1, 1, figsize=(15, 8))
        img = load_image(x_test[c])
        # cv2.resize expects dsize as (width, height)
        img = cv2.resize(img, dsize=(IMAGE_WIDTH, IMAGE_HEIGHT), interpolation=cv2.INTER_CUBIC)
        ax1.imshow(img, cmap=plt.cm.bone)
        # Label test images with the test predictions, not valid_pred
        ax1.set_title("Predicted Class {}, Actual Class {}".format(test_pred[c], y_test[c]))
        ax1.axis('off')
        plt.show()
In [127]:
viewPredictedImage(correct)
In [128]:
viewPredictedImage(incorrect)
In [134]:
evaluateROC(valid_pred_roc, test_pred_roc, y_valid, y_test)
ROC Curve for the validation data
--------------------------------------------------------------------------------
AUC: 0.369
ROC Curve for the test data
--------------------------------------------------------------------------------
AUC: 0.531
Classification Report on the test data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.56      1.00      0.71        20
   Pneumonia       0.00      0.00      0.00        16

    accuracy                           0.56        36
   macro avg       0.28      0.50      0.36        36
weighted avg       0.31      0.56      0.40        36

Classification Report on the validation data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.56      1.00      0.71        20
   Pneumonia       0.00      0.00      0.00        16

    accuracy                           0.56        36
   macro avg       0.28      0.50      0.36        36
weighted avg       0.31      0.56      0.40        36
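The report's numbers follow directly from the confusion matrix of an all-Normal predictor: every one of the 36 images is labelled Normal, so Normal precision is 20/36, Normal recall is 1.0, and the macro and weighted averages fall out of those two rows. A hand check in pure Python:

```python
support = {"Normal": 20, "Pneumonia": 16}
total = sum(support.values())   # 36

# All 36 images are predicted Normal
normal_precision = support["Normal"] / total
normal_recall = 1.0
normal_f1 = 2 * normal_precision * normal_recall / (normal_precision + normal_recall)

# Pneumonia row is all zeros, so the averages use only the Normal row
macro_precision = (normal_precision + 0.0) / 2
weighted_precision = normal_precision * support["Normal"] / total

print(round(normal_precision, 2), round(normal_f1, 2))            # 0.56 0.71
print(round(macro_precision, 2), round(weighted_precision, 2))    # 0.28 0.31
```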

In [135]:
print('Reloading model weights and evaluating on the validation set'); print('--'*40)
model.load_weights(MODEL_WEIGHTS + BEST_MODEL)
Reloading model weights and evaluating on the validation set
--------------------------------------------------------------------------------

Model 2: VGG16

In [136]:
print("Let's fit the VGG16 model...")
K.clear_session()
model = buildModel(VGG16)
callbacks = callback_model()
train_generator = generators.train_generator
validation_generator = generators.valid_generator
test_generator = generators.test_generator    
history = model.fit_generator(generator = train_generator, 
                              steps_per_epoch = generators.step_size_train,
                              epochs = EPOCH, verbose = VERBOSE, 
                              callbacks = callbacks,
                              validation_data = validation_generator, 
                              validation_steps = generators.step_size_valid)
print('Save the final weights'); print('--'*40)
model.save(MODEL_WEIGHTS + FINAL_MODEL)
Let's fit the VGG16 model...
[per-layer trainable flags for the VGG16 base: the first 7 layers (input + conv blocks 1-2) are frozen (False); the remaining 12 layers (conv blocks 3-5) are trainable (True)]
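The `True`/`False` flags above come from iterating over the base model's layers and freezing everything before a cutoff index. The actual `buildModel` helper is defined earlier in the notebook; the sketch below reproduces just the freezing loop with plain stand-in objects (`DummyLayer` and `freeze_until` are hypothetical names):

```python
# Sketch of the layer-freezing loop behind the True/False printout above.
# DummyLayer stands in for a Keras layer; with a real base model you would
# iterate over base_model.layers the same way.

class DummyLayer:
    def __init__(self, name):
        self.name = name
        self.trainable = True

def freeze_until(layers, cutoff):
    """Freeze all layers before index `cutoff`; leave the rest trainable."""
    for i, layer in enumerate(layers):
        layer.trainable = i >= cutoff
    return layers

# VGG16's convolutional base has 19 layers here; the first 7
# (input + blocks 1-2) are frozen and blocks 3-5 stay trainable,
# matching the flags printed above.
layers = [DummyLayer(f"layer_{i}") for i in range(19)]
freeze_until(layers, 7)
flags = [l.trainable for l in layers]
print(flags.count(False), flags.count(True))  # 7 frozen, 12 trainable
```

Freezing the early blocks keeps the generic edge/texture filters from ImageNet intact while letting the later, more task-specific blocks adapt to chest radiographs.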
Model: "VGG16"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Model)                (None, 7, 7, 512)         14714688  
_________________________________________________________________
global_average_pooling2d (Gl (None, 512)               0         
_________________________________________________________________
dropout (Dropout)            (None, 512)               0         
_________________________________________________________________
dense (Dense)                (None, 1)                 513       
=================================================================
Total params: 14,715,201
Trainable params: 14,455,041
Non-trainable params: 260,160
_________________________________________________________________
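The parameter counts in the summary follow directly from the head's shapes: `GlobalAveragePooling2D` and `Dropout` add no weights, and the final `Dense(1)` over 512 pooled features contributes 512 weights plus 1 bias. A quick arithmetic check:

```python
# Parameter accounting for the classification head in the summary above.
base_params = 14_714_688      # VGG16 convolutional base (from the summary)
gap_params = 0                # GlobalAveragePooling2D has no weights
dropout_params = 0            # Dropout has no weights
dense_params = 512 * 1 + 1    # Dense(1) on 512 features: weights + bias

total = base_params + gap_params + dropout_params + dense_params
print(dense_params, total)  # 513 14715201
```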
in call backs
Epoch 1/10
Epoch 00001: val_loss improved from inf to 0.71658, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 333s 33s/step - loss: 0.7545 - accuracy: 0.4726 - average_precision: 0.4946 - f1_score: 0.5380 - val_loss: 0.7166 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 2/10
Epoch 00002: val_loss improved from 0.71658 to 0.69497, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 328s 33s/step - loss: 0.7101 - accuracy: 0.4897 - average_precision: 0.5554 - f1_score: 0.4940 - val_loss: 0.6950 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 3/10
Epoch 00003: val_loss improved from 0.69497 to 0.69470, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 323s 32s/step - loss: 0.6984 - accuracy: 0.5445 - average_precision: 0.5402 - f1_score: 0.5697 - val_loss: 0.6947 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 4/10
Epoch 00004: val_loss improved from 0.69470 to 0.69218, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 325s 33s/step - loss: 0.7049 - accuracy: 0.4897 - average_precision: 0.4591 - f1_score: 0.3690 - val_loss: 0.6922 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 5/10
Epoch 00005: val_loss did not improve from 0.69218
10/10 [==============================] - 321s 32s/step - loss: 0.6957 - accuracy: 0.4349 - average_precision: 0.4399 - f1_score: 0.3605 - val_loss: 0.6925 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 6/10
Epoch 00006: val_loss did not improve from 0.69218
10/10 [==============================] - 322s 32s/step - loss: 0.6931 - accuracy: 0.5068 - average_precision: 0.4749 - f1_score: 0.4048 - val_loss: 0.6934 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 7/10
Epoch 00007: val_loss did not improve from 0.69218
10/10 [==============================] - 320s 32s/step - loss: 0.6928 - accuracy: 0.4760 - average_precision: 0.4923 - f1_score: 0.5487 - val_loss: 0.6935 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 8/10
Epoch 00008: val_loss did not improve from 0.69218
10/10 [==============================] - 320s 32s/step - loss: 0.6923 - accuracy: 0.5103 - average_precision: 0.5080 - f1_score: 0.5585 - val_loss: 0.6934 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 9/10
Epoch 00009: val_loss did not improve from 0.69218

Epoch 00009: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
10/10 [==============================] - 320s 32s/step - loss: 0.6957 - accuracy: 0.4384 - average_precision: 0.4462 - f1_score: 0.3454 - val_loss: 0.6924 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 10/10
Epoch 00010: val_loss did not improve from 0.69218
10/10 [==============================] - 320s 32s/step - loss: 0.6927 - accuracy: 0.5000 - average_precision: 0.4617 - f1_score: 0.2198 - val_loss: 0.6924 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Save the final weights
--------------------------------------------------------------------------------
In [137]:
valid_pred, y_valid, x_valid, valid_pred_roc = evaluateValidationData(model)
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 0.692, Accuracy: 0.556,  AP: 0.0, F1 Score: 0.0
Predict on the validation data
--------------------------------------------------------------------------------
2/2 [==============================] - 12s 6s/step
In [138]:
test_pred, y_test, x_test, correct, incorrect, test_pred_roc = evaluateTestData(y_valid)
Predict on the test data
--------------------------------------------------------------------------------
2/2 [==============================] - 12s 6s/step
0    20
1    16
dtype: int64
0    20
1    16
dtype: int64
Correctly predicted 20 images out of 36 images
Predicted 56% test images correctly
In [139]:
viewPredictedImage(correct)
In [140]:
viewPredictedImage(incorrect)
In [141]:
evaluateROC(valid_pred_roc,test_pred_roc,y_valid,y_test)
ROC Curve for the validation data
--------------------------------------------------------------------------------
AUC: 0.541
ROC Curve for the test data
--------------------------------------------------------------------------------
AUC: 0.475
Classification Report on the test data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.56      1.00      0.71        20
   Pneumonia       0.00      0.00      0.00        16

    accuracy                           0.56        36
   macro avg       0.28      0.50      0.36        36
weighted avg       0.31      0.56      0.40        36

Classification Report on the validation data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.56      1.00      0.71        20
   Pneumonia       0.00      0.00      0.00        16

    accuracy                           0.56        36
   macro avg       0.28      0.50      0.36        36
weighted avg       0.31      0.56      0.40        36
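The report shows a degenerate classifier: recall 1.00 for Normal and 0.00 for Pneumonia means the model predicts "Normal" for every image. All of the reported figures then follow from the class counts alone (20 Normal, 16 Pneumonia). A small sketch reconstructing them:

```python
# Reconstructing the report above for an all-"Normal" predictor:
# every prediction is Normal, so TP_normal = 20 and FP_normal = 16.
n_normal, n_pneumonia = 20, 16
total = n_normal + n_pneumonia

precision_normal = n_normal / total   # 20 correct out of 36 "Normal" predictions
recall_normal = 1.0                   # every Normal image is caught
f1_normal = (2 * precision_normal * recall_normal
             / (precision_normal + recall_normal))
accuracy = n_normal / total           # only the Normal images are right

print(round(precision_normal, 2), round(f1_normal, 2), round(accuracy, 2))
```

This is why accuracy (0.56) is misleading here: it simply equals the prevalence of the majority class, while the Pneumonia row collapses to zero.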

In [142]:
print("Let's fit the ResNet50 model.....")
K.clear_session()
model = buildModel(ResNet50)
callbacks = callback_model()
train_generator = generators.train_generator
validation_generator = generators.valid_generator
test_generator = generators.test_generator    
# Note: fit_generator is deprecated in TF2; model.fit accepts generators directly.
history = model.fit_generator(generator = train_generator,
                              steps_per_epoch = generators.step_size_train,
                              epochs = EPOCH, verbose = VERBOSE,
                              callbacks = callbacks,
                              validation_data = validation_generator,
                              validation_steps = generators.step_size_valid)
print('Save the final weights'); print('--'*40)
model.save(MODEL_WEIGHTS + FINAL_MODEL)
Let's fit the ResNet50 model.....
[per-layer trainable flags for the ResNet50 base: all layers frozen (False) except the last 12 (the final residual block: Add, Activation, and three Conv2D/BatchNormalization pairs), which are trainable (True)]
Model: "ResNet50"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
resnet50 (Model)             (None, 7, 7, 2048)        23587712  
_________________________________________________________________
global_average_pooling2d (Gl (None, 2048)              0         
_________________________________________________________________
dropout (Dropout)            (None, 2048)              0         
_________________________________________________________________
dense (Dense)                (None, 1)                 2049      
=================================================================
Total params: 23,589,761
Trainable params: 4,467,713
Non-trainable params: 19,122,048
_________________________________________________________________
in call backs
Epoch 1/10
Epoch 00001: val_loss improved from inf to 0.96196, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 201s 20s/step - loss: 0.7810 - accuracy: 0.5240 - average_precision: 0.5574 - f1_score: 0.4478 - val_loss: 0.9620 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 2/10
Epoch 00002: val_loss did not improve from 0.96196
10/10 [==============================] - 184s 18s/step - loss: 0.5821 - accuracy: 0.7123 - average_precision: 0.6900 - f1_score: 0.6881 - val_loss: 1.0214 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 3/10
Epoch 00003: val_loss did not improve from 0.96196
10/10 [==============================] - 178s 18s/step - loss: 0.5842 - accuracy: 0.7226 - average_precision: 0.7282 - f1_score: 0.7519 - val_loss: 1.0371 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 4/10
Epoch 00004: val_loss did not improve from 0.96196
10/10 [==============================] - 174s 17s/step - loss: 0.5114 - accuracy: 0.7397 - average_precision: 0.7476 - f1_score: 0.7666 - val_loss: 1.0262 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 5/10
Epoch 00005: val_loss did not improve from 0.96196
10/10 [==============================] - 175s 17s/step - loss: 0.5400 - accuracy: 0.7568 - average_precision: 0.7598 - f1_score: 0.7418 - val_loss: 1.0374 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 6/10
Epoch 00006: val_loss did not improve from 0.96196

Epoch 00006: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
10/10 [==============================] - 188s 19s/step - loss: 0.5337 - accuracy: 0.7295 - average_precision: 0.7059 - f1_score: 0.7172 - val_loss: 1.1012 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 7/10
Epoch 00007: val_loss did not improve from 0.96196
10/10 [==============================] - 175s 17s/step - loss: 0.5022 - accuracy: 0.7740 - average_precision: 0.6828 - f1_score: 0.7092 - val_loss: 1.1215 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 8/10
Epoch 00008: val_loss did not improve from 0.96196
10/10 [==============================] - 179s 18s/step - loss: 0.4974 - accuracy: 0.7329 - average_precision: 0.7497 - f1_score: 0.7523 - val_loss: 1.1421 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 9/10
Epoch 00009: val_loss did not improve from 0.96196
10/10 [==============================] - 175s 18s/step - loss: 0.4983 - accuracy: 0.7671 - average_precision: 0.7724 - f1_score: 0.7974 - val_loss: 1.1602 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 10/10
Epoch 00010: val_loss did not improve from 0.96196
10/10 [==============================] - 172s 17s/step - loss: 0.4500 - accuracy: 0.8116 - average_precision: 0.8300 - f1_score: 0.8276 - val_loss: 1.1791 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Save the final weights
--------------------------------------------------------------------------------
In [143]:
valid_pred, y_valid, x_valid, valid_pred_roc = evaluateValidationData(model)
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 1.179, Accuracy: 0.444,  AP: 0.469, F1 Score: 0.638
Predict on the validation data
--------------------------------------------------------------------------------
2/2 [==============================] - 7s 4s/step
In [144]:
test_pred, y_test, x_test, correct, incorrect, test_pred_roc = evaluateTestData(y_valid)
Predict on the test data
--------------------------------------------------------------------------------
2/2 [==============================] - 8s 4s/step
0    20
1    16
dtype: int64
0    20
1    16
dtype: int64
Correctly predicted 16 images out of 36 images
Predicted 44% test images correctly
In [145]:
viewPredictedImage(correct)
In [146]:
viewPredictedImage(incorrect)
In [147]:
evaluateROC(valid_pred_roc,test_pred_roc,y_valid,y_test)
ROC Curve for the validation data
--------------------------------------------------------------------------------
AUC: 0.494
ROC Curve for the test data
--------------------------------------------------------------------------------
AUC: 0.450
Classification Report on the test data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.00      0.00      0.00        20
   Pneumonia       0.44      1.00      0.62        16

    accuracy                           0.44        36
   macro avg       0.22      0.50      0.31        36
weighted avg       0.20      0.44      0.27        36

Classification Report on the validation data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.00      0.00      0.00        20
   Pneumonia       0.44      1.00      0.62        16

    accuracy                           0.44        36
   macro avg       0.22      0.50      0.31        36
weighted avg       0.20      0.44      0.27        36
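The AUC values reported by `evaluateROC` (0.45-0.54 here) sit at chance level, consistent with the collapsed predictions above. ROC AUC has a useful probabilistic reading: it is the probability that a randomly chosen positive example scores higher than a randomly chosen negative one. A minimal pairwise (Mann-Whitney) sketch of that computation, independent of the notebook's helper:

```python
# Minimal ROC AUC via the Mann-Whitney formulation: the fraction of
# (positive, negative) pairs where the positive scores higher (ties count 0.5).
def roc_auc(y_true, scores):
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# An AUC near 0.5 means the model's scores rank positives no better
# than chance -- roughly what both backbones achieve on this data.
y = [1, 1, 0, 0, 1, 0]
s = [0.9, 0.4, 0.3, 0.8, 0.6, 0.2]
print(roc_auc(y, s))
```

Because AUC depends only on the ranking of scores, it can flag a degenerate classifier even when accuracy looks acceptable.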

In [148]:
print("Let's fit the InceptionV3 model.....")
K.clear_session()
model = buildModel(InceptionV3)
callbacks = callback_model()
train_generator = generators.train_generator
validation_generator = generators.valid_generator
test_generator = generators.test_generator    
# Note: fit_generator is deprecated in TF2; model.fit accepts generators directly.
history = model.fit_generator(generator = train_generator,
                              steps_per_epoch = generators.step_size_train,
                              epochs = EPOCH, verbose = VERBOSE,
                              callbacks = callbacks,
                              validation_data = validation_generator,
                              validation_steps = generators.step_size_valid)
print('Save the final weights'); print('--'*40)
model.save(MODEL_WEIGHTS + FINAL_MODEL)
Let's fit the InceptionV3 model.....
[Per-layer trainability printout truncated: buildModel prints every InceptionV3 layer with its trainable flag. All base layers are frozen (False) except the final few Conv2D / BatchNormalization / Activation / Concatenate layers, which are left trainable (True) for fine-tuning.]
Model: "DenseNet121"
[Note: the model name string above is a leftover label from the DenseNet121 run; the summary below is the InceptionV3 model.]
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
inception_v3 (Model)         (None, 5, 5, 2048)        21802784  
_________________________________________________________________
global_average_pooling2d (Gl (None, 2048)              0         
_________________________________________________________________
dropout (Dropout)            (None, 2048)              0         
_________________________________________________________________
dense (Dense)                (None, 1)                 2049      
=================================================================
Total params: 21,804,833
Trainable params: 395,777
Non-trainable params: 21,409,056
_________________________________________________________________
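The summary shows the structure built by buildModel: the frozen InceptionV3 base topped by global average pooling, dropout, and a single sigmoid unit, so only ~396 k of the ~21.8 M parameters train. A minimal sketch of that transfer-learning head (weights=None keeps this sketch offline; the notebook presumably loads ImageNet weights, and it additionally unfreezes the last few inception blocks, whereas here the whole base is frozen; the dropout rate is an assumption):

```python
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras import layers, models

# Base network without its classification top (weights=None avoids a download here)
base = InceptionV3(include_top=False, weights=None, input_shape=(139, 139, 3))
base.trainable = False  # freeze the convolutional base for transfer learning

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),       # (None, 2048) feature vector
    layers.Dropout(0.5),                   # rate is an assumption
    layers.Dense(1, activation="sigmoid"), # binary Normal vs Pneumonia output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

The final Dense layer contributes 2048 weights plus 1 bias, matching the 2,049 trainable parameters shown for `dense` in the summary.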
in call backs
Epoch 1/10
 9/10 [==========================>...] - ETA: 10s - loss: 0.7184 - accuracy: 0.5000 - average_precision: 0.5179 - f1_score: 0.4873
Epoch 00001: val_loss improved from inf to 0.72231, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 114s 11s/step - loss: 0.7199 - accuracy: 0.5000 - average_precision: 0.5187 - f1_score: 0.4942 - val_loss: 0.7223 - val_accuracy: 0.5000 - val_average_precision: 0.1875 - val_f1_score: 0.1364
Epoch 2/10
 9/10 [==========================>...] - ETA: 10s - loss: 0.7227 - accuracy: 0.5192 - average_precision: 0.4850 - f1_score: 0.4757
Epoch 00002: val_loss did not improve from 0.72231
10/10 [==============================] - 108s 11s/step - loss: 0.7125 - accuracy: 0.5377 - average_precision: 0.5053 - f1_score: 0.4969 - val_loss: 0.7229 - val_accuracy: 0.4444 - val_average_precision: 0.1500 - val_f1_score: 0.1250
Epoch 3/10
 9/10 [==========================>...] - ETA: 9s - loss: 0.6823 - accuracy: 0.6308 - average_precision: 0.5980 - f1_score: 0.6040 
Epoch 00003: val_loss did not improve from 0.72231
10/10 [==============================] - 108s 11s/step - loss: 0.6786 - accuracy: 0.6336 - average_precision: 0.6240 - f1_score: 0.6122 - val_loss: 0.7228 - val_accuracy: 0.5000 - val_average_precision: 0.1875 - val_f1_score: 0.1364
Epoch 4/10
 9/10 [==========================>...] - ETA: 10s - loss: 0.6465 - accuracy: 0.6154 - average_precision: 0.6466 - f1_score: 0.6013
Epoch 00004: val_loss did not improve from 0.72231
10/10 [==============================] - 107s 11s/step - loss: 0.6441 - accuracy: 0.6199 - average_precision: 0.6463 - f1_score: 0.6032 - val_loss: 0.7224 - val_accuracy: 0.4444 - val_average_precision: 0.1500 - val_f1_score: 0.1250
Epoch 5/10
 9/10 [==========================>...] - ETA: 10s - loss: 0.6161 - accuracy: 0.6538 - average_precision: 0.6806 - f1_score: 0.6846
Epoch 00005: val_loss did not improve from 0.72231
10/10 [==============================] - 108s 11s/step - loss: 0.6186 - accuracy: 0.6575 - average_precision: 0.6858 - f1_score: 0.6849 - val_loss: 0.7227 - val_accuracy: 0.4167 - val_average_precision: 0.1364 - val_f1_score: 0.1200
Epoch 6/10
 9/10 [==========================>...] - ETA: 10s - loss: 0.6306 - accuracy: 0.6538 - average_precision: 0.7000 - f1_score: 0.6572
Epoch 00006: val_loss did not improve from 0.72231

Epoch 00006: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
10/10 [==============================] - 108s 11s/step - loss: 0.6325 - accuracy: 0.6541 - average_precision: 0.6800 - f1_score: 0.6536 - val_loss: 0.7232 - val_accuracy: 0.3611 - val_average_precision: 0.1250 - val_f1_score: 0.1154
Epoch 7/10
 9/10 [==========================>...] - ETA: 10s - loss: 0.6141 - accuracy: 0.6769 - average_precision: 0.6370 - f1_score: 0.6681
Epoch 00007: val_loss did not improve from 0.72231
10/10 [==============================] - 108s 11s/step - loss: 0.6086 - accuracy: 0.6781 - average_precision: 0.6420 - f1_score: 0.6700 - val_loss: 0.7232 - val_accuracy: 0.3611 - val_average_precision: 0.1250 - val_f1_score: 0.1154
Epoch 8/10
 9/10 [==========================>...] - ETA: 10s - loss: 0.5799 - accuracy: 0.6808 - average_precision: 0.7074 - f1_score: 0.7044
Epoch 00008: val_loss did not improve from 0.72231
10/10 [==============================] - 108s 11s/step - loss: 0.5736 - accuracy: 0.6815 - average_precision: 0.7073 - f1_score: 0.7045 - val_loss: 0.7232 - val_accuracy: 0.3611 - val_average_precision: 0.1250 - val_f1_score: 0.1154
Epoch 9/10
 9/10 [==========================>...] - ETA: 11s - loss: 0.6433 - accuracy: 0.6319 - average_precision: 0.6430 - f1_score: 0.6484
Epoch 00009: val_loss did not improve from 0.72231
10/10 [==============================] - 108s 11s/step - loss: 0.6238 - accuracy: 0.6336 - average_precision: 0.6287 - f1_score: 0.6502 - val_loss: 0.7232 - val_accuracy: 0.3611 - val_average_precision: 0.1250 - val_f1_score: 0.1154
Epoch 10/10
 9/10 [==========================>...] - ETA: 10s - loss: 0.6436 - accuracy: 0.6654 - average_precision: 0.6571 - f1_score: 0.6868
Epoch 00010: val_loss did not improve from 0.72231
10/10 [==============================] - 109s 11s/step - loss: 0.6336 - accuracy: 0.6644 - average_precision: 0.6602 - f1_score: 0.6848 - val_loss: 0.7230 - val_accuracy: 0.3611 - val_average_precision: 0.1250 - val_f1_score: 0.1154
Save the final weights
--------------------------------------------------------------------------------
In [150]:
valid_pred, y_valid, x_valid, valid_pred_roc = evaluateValidationData(model)
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 0.723, Accuracy: 0.361,  AP: 0.125, F1 Score: 0.115
Predict on the validation data
--------------------------------------------------------------------------------
2/2 [==============================] - 5s 3s/step
In [151]:
test_pred, y_test, x_test, correct, incorrect, test_pred_roc = evaluateTestData(y_valid)
Predict on the test data
--------------------------------------------------------------------------------
2/2 [==============================] - 7s 3s/step
0    20
1    16
dtype: int64
0    20
1    16
dtype: int64
Correctly predicted 15 of 36 test images
Predicted 42% of test images correctly
In [152]:
viewPredictedImage(correct)
In [153]:
viewPredictedImage(incorrect)
In [154]:
evaluateROC(valid_pred_roc,test_pred_roc,y_valid,y_test)
ROC Curve for the validation data
--------------------------------------------------------------------------------
AUC: 0.412
ROC Curve for the test data
--------------------------------------------------------------------------------
AUC: 0.394
Classification Report on the test data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.48      0.55      0.51        20
   Pneumonia       0.31      0.25      0.28        16

    accuracy                           0.42        36
   macro avg       0.39      0.40      0.39        36
weighted avg       0.40      0.42      0.41        36

Classification Report on the validation data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.43      0.50      0.47        20
   Pneumonia       0.23      0.19      0.21        16

    accuracy                           0.36        36
   macro avg       0.33      0.34      0.34        36
weighted avg       0.34      0.36      0.35        36
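The `evaluateROC` helper above reports an AUC below 0.5 on both splits. To make that number concrete, the AUC can be computed directly (without assuming anything about the notebook's helper, which presumably wraps `sklearn.metrics`) as the probability that a randomly chosen positive example is scored above a randomly chosen negative one:

```python
def roc_auc(y_true, y_score):
    """ROC AUC via the Mann-Whitney pair statistic: the fraction of
    (positive, negative) pairs ranked correctly, counting ties as half."""
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Example: 3 of the 4 positive/negative pairs are ranked correctly.
print(roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

An AUC of 0.412/0.394, as above, therefore means the model ranks a Normal image above a Pneumonia image more often than chance, i.e. the learned scores are slightly anti-correlated with the labels.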

In [155]:
print("Let's fit the VGG19 model...")
K.clear_session()
model = buildModel(VGG19)
callbacks = callback_model()
train_generator = generators.train_generator
validation_generator = generators.valid_generator
test_generator = generators.test_generator    
history = model.fit_generator(generator = train_generator, 
                              steps_per_epoch = generators.step_size_train,
                              epochs = EPOCH, verbose = VERBOSE, 
                              callbacks = callbacks,
                              validation_data = validation_generator, 
                              validation_steps = generators.step_size_valid)
print('Save the final weights'); print('--'*40)
model.save(MODEL_WEIGHTS + FINAL_MODEL)
Let's fit the VGG19 model...
[Layer-trainability printout abridged: the VGG19 input layer and the first nine convolution/pooling layers (through block3_conv3) are frozen (trainable = False); the remaining twelve layers, from block3_conv4 through the final pooling layer, are trainable (True).]
Model: "VGG19"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg19 (Model)                (None, 7, 7, 512)         20024384  
_________________________________________________________________
global_average_pooling2d (Gl (None, 512)               0         
_________________________________________________________________
dropout (Dropout)            (None, 512)               0         
_________________________________________________________________
dense (Dense)                (None, 1)                 513       
=================================================================
Total params: 20,024,897
Trainable params: 18,289,409
Non-trainable params: 1,735,488
_________________________________________________________________
in callbacks
Epoch 1/10
 9/10 [==========================>...] - ETA: 41s - loss: 0.7123 - accuracy: 0.5000 - average_precision: 0.5733 - f1_score: 0.5678 
Epoch 00001: val_loss improved from inf to 0.69634, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 440s 44s/step - loss: 0.7171 - accuracy: 0.4863 - average_precision: 0.5577 - f1_score: 0.5610 - val_loss: 0.6963 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 2/10
 9/10 [==========================>...] - ETA: 40s - loss: 0.7221 - accuracy: 0.4885 - average_precision: 0.4888 - f1_score: 0.4733 
Epoch 00002: val_loss improved from 0.69634 to 0.69247, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 432s 43s/step - loss: 0.7192 - accuracy: 0.4795 - average_precision: 0.4816 - f1_score: 0.4773 - val_loss: 0.6925 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 3/10
 9/10 [==========================>...] - ETA: 40s - loss: 0.6945 - accuracy: 0.4962 - average_precision: 0.5793 - f1_score: 0.4649 
Epoch 00003: val_loss did not improve from 0.69247
10/10 [==============================] - 420s 42s/step - loss: 0.6958 - accuracy: 0.4897 - average_precision: 0.5668 - f1_score: 0.4710 - val_loss: 0.6937 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 4/10
 9/10 [==========================>...] - ETA: 37s - loss: 0.6953 - accuracy: 0.4577 - average_precision: 0.3234 - f1_score: 0.2151 
Epoch 00004: val_loss improved from 0.69247 to 0.69182, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 400s 40s/step - loss: 0.6954 - accuracy: 0.4521 - average_precision: 0.2911 - f1_score: 0.1936 - val_loss: 0.6918 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 5/10
 9/10 [==========================>...] - ETA: 37s - loss: 0.6949 - accuracy: 0.4692 - average_precision: 0.0000e+00 - f1_score: 0.0000e+00 
Epoch 00005: val_loss improved from 0.69182 to 0.69168, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 400s 40s/step - loss: 0.6956 - accuracy: 0.4623 - average_precision: 0.0000e+00 - f1_score: 0.0000e+00 - val_loss: 0.6917 - val_accuracy: 0.5556 - val_average_precision: 0.0000e+00 - val_f1_score: 0.0000e+00
Epoch 6/10
 9/10 [==========================>...] - ETA: 37s - loss: 0.6975 - accuracy: 0.4808 - average_precision: 0.4248 - f1_score: 0.3237   
Epoch 00006: val_loss did not improve from 0.69168
10/10 [==============================] - 395s 39s/step - loss: 0.6975 - accuracy: 0.4726 - average_precision: 0.4263 - f1_score: 0.3450 - val_loss: 0.6943 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 7/10
 9/10 [==========================>...] - ETA: 37s - loss: 0.6942 - accuracy: 0.4808 - average_precision: 0.4950 - f1_score: 0.5606 
Epoch 00007: val_loss did not improve from 0.69168
10/10 [==============================] - 393s 39s/step - loss: 0.6949 - accuracy: 0.4760 - average_precision: 0.4812 - f1_score: 0.5572 - val_loss: 0.6951 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 8/10
 9/10 [==========================>...] - ETA: 37s - loss: 0.6877 - accuracy: 0.5615 - average_precision: 0.5596 - f1_score: 0.7026 
Epoch 00008: val_loss did not improve from 0.69168
10/10 [==============================] - 394s 39s/step - loss: 0.6896 - accuracy: 0.5479 - average_precision: 0.5503 - f1_score: 0.6932 - val_loss: 0.6985 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 9/10
 9/10 [==========================>...] - ETA: 37s - loss: 0.6927 - accuracy: 0.5038 - average_precision: 0.5380 - f1_score: 0.6716 
Epoch 00009: val_loss did not improve from 0.69168
10/10 [==============================] - 401s 40s/step - loss: 0.6934 - accuracy: 0.5000 - average_precision: 0.5287 - f1_score: 0.6630 - val_loss: 0.6982 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 10/10
 9/10 [==========================>...] - ETA: 38s - loss: 0.6928 - accuracy: 0.5269 - average_precision: 0.5169 - f1_score: 0.6689 
Epoch 00010: val_loss did not improve from 0.69168

Epoch 00010: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
10/10 [==============================] - 400s 40s/step - loss: 0.6927 - accuracy: 0.5205 - average_precision: 0.5136 - f1_score: 0.6658 - val_loss: 0.6967 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Save the final weights
--------------------------------------------------------------------------------
In [156]:
valid_pred, y_valid, x_valid, valid_pred_roc = evaluateValidationData(model)
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 0.697, Accuracy: 0.444,  AP: 0.469, F1 Score: 0.638
Predict on the validation data
--------------------------------------------------------------------------------
2/2 [==============================] - 14s 7s/step
In [157]:
test_pred, y_test, x_test, correct, incorrect, test_pred_roc = evaluateTestData(y_valid)
Predict on the test data
--------------------------------------------------------------------------------
2/2 [==============================] - 14s 7s/step
0    20
1    16
dtype: int64
0    20
1    16
dtype: int64
Correctly predicted 16 of 36 test images
Predicted 44% of test images correctly
In [158]:
viewPredictedImage(correct)
In [159]:
viewPredictedImage(incorrect)
In [160]:
evaluateROC(valid_pred_roc,test_pred_roc,y_valid,y_test)
ROC Curve for the validation data
--------------------------------------------------------------------------------
AUC: 0.453
ROC Curve for the test data
--------------------------------------------------------------------------------
AUC: 0.419
Classification Report on the test data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.00      0.00      0.00        20
   Pneumonia       0.44      1.00      0.62        16

    accuracy                           0.44        36
   macro avg       0.22      0.50      0.31        36
weighted avg       0.20      0.44      0.27        36

Classification Report on the validation data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.00      0.00      0.00        20
   Pneumonia       0.44      1.00      0.62        16

    accuracy                           0.44        36
   macro avg       0.22      0.50      0.31        36
weighted avg       0.20      0.44      0.27        36
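Both reports show the VGG19 run collapsed to predicting Pneumonia for every image (Normal recall 0.00, Pneumonia recall 1.00). One common mitigation, hedged here as a sketch rather than what the notebook actually does, is to weight the loss by inverse class frequency and pass the result as `class_weight` when fitting:

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Weight each class by total / (n_classes * count) --
    the same rule as sklearn's class_weight='balanced'."""
    counts = Counter(labels)
    total = len(labels)
    return {c: total / (len(counts) * n) for c, n in counts.items()}

# With 20 Normal (0) and 16 Pneumonia (1), as in the splits above:
weights = inverse_frequency_weights([0] * 20 + [1] * 16)
print(weights)  # {0: 0.9, 1: 1.125}
```

The resulting dict could then be supplied via `class_weight=weights` in the fit call, nudging the model away from the all-Pneumonia solution; whether that helps here would need to be verified empirically.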

In [161]:
print("Let's fit the DenseNet169 model...")
K.clear_session()
model = buildModel(DenseNet169)
callbacks = callback_model()
train_generator = generators.train_generator
validation_generator = generators.valid_generator
test_generator = generators.test_generator    
history = model.fit_generator(generator = train_generator, 
                              steps_per_epoch = generators.step_size_train,
                              epochs = EPOCH, verbose = VERBOSE, 
                              callbacks = callbacks,
                              validation_data = validation_generator, 
                              validation_steps = generators.step_size_valid)
print('Save the final weights'); print('--'*40)
model.save(MODEL_WEIGHTS + FINAL_MODEL)
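The same fit-and-save boilerplate is repeated verbatim for each backbone (VGG19 above, DenseNet169 here). A small helper could factor it out; the names below mirror the notebook's cells but the wrapper itself is an assumption, not an existing function:

```python
def train_backbone(backbone, build_model, make_callbacks, generators,
                   epochs, verbose, save_path):
    """Build a model on the given backbone, fit it with the shared
    generators/callbacks, save the final weights, and return the model.
    Uses fit_generator to match the notebook's (older) Keras API."""
    model = build_model(backbone)
    model.fit_generator(generator=generators.train_generator,
                        steps_per_epoch=generators.step_size_train,
                        epochs=epochs, verbose=verbose,
                        callbacks=make_callbacks(),
                        validation_data=generators.valid_generator,
                        validation_steps=generators.step_size_valid)
    model.save(save_path)
    return model
```

Each experiment cell then reduces to a single call, e.g. `train_backbone(DenseNet169, buildModel, callback_model, generators, EPOCH, VERBOSE, MODEL_WEIGHTS + FINAL_MODEL)`.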
Let's fit the DenseNet169 model...
[Layer-trainability printout abridged: every DenseNet169 base layer shown, up to the point where the listing is truncated, is frozen (trainable = False).]
<tensorflow.python.keras.layers.core.Activation object at 0x7f98aada8978> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98aada8940> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f98978cfc50> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98978cfb70> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98978c5da0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98978c5d30> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b795ded30> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b795ef630> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b795ef5f8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b78fe4908> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b78fe4748> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b78fdca58> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b78fdc9e8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989aa979e8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b69e8f2e8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b69e8f2b0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f98afc1e5c0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98afc1e400> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98afc11710> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98afc116a0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98c59ff6a0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98c59fff60> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98c5a24390> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9897736f98> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9897759160> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989e5553c8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989e555358> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989a9a1358> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989a9a5048> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98971d4ef0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b79fd7ef0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b79fd7e10> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b79fe6ef0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b79fe6f60> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b78b63fd0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b78b5d8d0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b78b5d898> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b718f1ba8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b718f1ac8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9897771cf8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9897771c88> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9896f4bc88> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9896f80588> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9896f80550> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9897477860> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98974776a0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989746b9b0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989746b940> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989e686940> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989e65d240> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989e65d208> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989af1a518> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989af1a358> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989af0f668> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989af0f5f8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6903a5f8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b690140b8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b718afe80> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6cfaefd0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6cf950b8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6cf88320> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6cf882b0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98a654ffd0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b795a9d30> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b795a9ba8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f98afcf1e48> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98afcf1d68> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98afcef080> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98c0e8af28> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b79ec7f28> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b79ede828> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b79ede7f0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989aed8b00> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989aed8940> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989aecec50> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989aecebe0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b78e76be0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b78f744e0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b78f744a8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6a2037b8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6a2035f8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6a209908> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6a209898> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98a1170898> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98a1163198> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98a1163160> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f98a8bcf470> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98a8bcf2b0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98a8be05c0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98a8be0550> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989a4c0550> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6feee048> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6ff85dd8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6ff8cf98> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6fe1b0f0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6fe08278> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6fe08208> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989f974f28> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989f97aac8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989f97aa90> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f98afc6cda0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98afc6ccc0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98afc5bef0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98afc5be80> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98afb98e80> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98afb80780> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98afb80748> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989760da58> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989760d898> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98974afba8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98974afb38> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989a815b38> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989a82f438> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989a82f400> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9899a39710> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9899a39550> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9899a10860> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9899a107f0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989ab377f0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989ab36208> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989ab361d0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6f4673c8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6f467208> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6c94d518> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6c94d4a8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b7099b4a8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b70974198> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b7181fcc0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b6b041ef0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6b041f98> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6b055fd0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6c834e10> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6d978fd0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6d94ea20> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6d94e9e8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b77010cf8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b77010c18> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b77012e48> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b77012dd8> False
<tensorflow.python.keras.layers.pooling.AveragePooling2D object at 0x7f9b6a1c6dd8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6a1c6eb8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6b778080> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6ac83f28> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b6deaef28> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b6deb1828> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b6deb17f0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b70587b00> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b70587940> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b70564c50> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b70564be0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b70ed2be0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b70ef94e0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b70ef94a8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b690cd7b8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b690cd5f8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b690a6908> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b690a6898> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b78c4f898> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b78c58198> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b78c58160> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b707a4470> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b707a42b0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b707705c0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b70770550> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98a169a550> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98a168a048> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98a58c2dd8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f98a58abf98> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98a5e81240> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98a5e5c278> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98a5e5c208> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b70c54978> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b70c5eac8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b70c5ea90> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989a49eda0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989a49ecc0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989a485ef0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989a485e80> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989e6bae80> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989e711780> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989e711748> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9b71b78a58> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9b71b78898> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9b71b5bba8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9b71b5bb38> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9899a97b38> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9899aa7438> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9899aa7400> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9897ec6710> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9897ec6550> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9897ecb860> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9897ecb7f0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98982c47f0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989837b208> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989837b1d0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989860a3c8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989860a208> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989860f518> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989860f4a8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989885d4a8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9898869198> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98988bbcc0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f98988d2ef0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98988d2f98> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98988e7fd0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98989a4e10> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98989e5fd0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98989ffa20> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98989ff9e8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9898b76cf8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9898b76c18> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9898b8ae48> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9898b8add8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98990a9dd8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98990bb6d8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98990bb6a0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f98995459b0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98995457f0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9899558b00> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9899558a90> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98997aba90> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f98997b9390> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f98997b9358> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f98998b9668> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f98998b94a8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9899a497b8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9899a49748> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989a14e748> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989a14ef98> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989a1591d0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989a260320> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989a260160> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989a270470> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989a270400> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989a2f1400> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989a2fd0f0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989a34df98> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989a36bf98> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989a36bef0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989a37ef98> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989a5a5cf8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989a5bef28> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989a790978> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989a790940> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989a84dc50> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989a84db70> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989a85eda0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989a85ed30> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989a95cd30> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989a96e630> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989a96e5f8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989ae8c908> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989ae8c748> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989ae9ca58> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989ae9c9e8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989b2279e8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989b23a2e8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989b23a2b0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989b4685c0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989b468400> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989b46f710> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989b46f6a0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989b51f6a0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989b51ff60> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989b528390> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989b753f98> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989b769160> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989b79e3c8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989b79e358> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989b821358> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989b831048> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989ba46ef0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989ba62ef0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989ba62e10> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989ba76ef0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989ba76f60> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989bb19fd0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989bb2a8d0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989bb2a898> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989bd7eba8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989bd7eac8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989bd8acf8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989bd8ac88> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989be34c88> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989be43588> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989be43550> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9899b01860> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9899b016a0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9899b129b0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9899b12940> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9899b53940> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9899b66240> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9899b66208> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f9897c4f518> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9897c4f358> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9897c58668> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f9897c585f8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f9899e2e5f8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f9899e330b8> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989a0abe80> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989a0c7fd0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989b04d0b8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989b05f320> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989b05f2b0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989b0c5fd0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989b0dcb70> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989b0dcb38> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989b327e48> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989b327d68> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989b340080> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989b582f28> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989b59ef28> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989b5b4828> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989b5b47f0> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989b631b00> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989b631940> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989b862c50> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989b862be0> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989b8e3be0> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989b8f44e0> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989b8f44a8> False
<tensorflow.python.keras.layers.merge.Concatenate object at 0x7f989bb607b8> False
<tensorflow.python.keras.layers.normalization.BatchNormalization object at 0x7f989bb605f8> False
<tensorflow.python.keras.layers.core.Activation object at 0x7f989bb6e908> False
<tensorflow.python.keras.layers.convolutional.Conv2D object at 0x7f989bb6e898> False
... (repetitive layer listing trimmed: every layer up to the final dense block is frozen, trainable=False) ...
<tensorflow.python.keras.layers.core.Activation object at 0x7f9899f47438> True
... (the last 12 layers — Conv2D, BatchNormalization, Activation, Concatenate — are left trainable=True for fine-tuning) ...
<tensorflow.python.keras.layers.core.Activation object at 0x7f989c215550> True
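The `True`/`False` flags above come from looping over the backbone's layers and printing `layer.trainable` after freezing. A minimal sketch of that freezing step, assuming a DenseNet169 backbone and an illustrative cut-off of the last 12 layers (`weights=None` keeps the sketch self-contained; the project fine-tunes ImageNet weights):

```python
from tensorflow.keras.applications import DenseNet169

# Build the backbone; weights=None avoids the ImageNet download in this sketch.
base = DenseNet169(weights=None, include_top=False, input_shape=(224, 224, 3))

# Freeze everything, then unfreeze the tail so only it is fine-tuned.
# The cut-off (last 12 layers) is an assumption matching the listing above.
for layer in base.layers:
    layer.trainable = False
for layer in base.layers[-12:]:
    layer.trainable = True

# Reproduces the "<layer object> True/False" style of output shown above.
for layer in base.layers[-4:]:
    print(layer, layer.trainable)
```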
Model: "DenseNet169"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
densenet169 (Model)          (None, 7, 7, 1664)        12642880  
_________________________________________________________________
global_average_pooling2d (Gl (None, 1664)              0         
_________________________________________________________________
dropout (Dropout)            (None, 1664)              0         
_________________________________________________________________
dense (Dense)                (None, 1)                 1665      
=================================================================
Total params: 12,644,545
Trainable params: 291,137
Non-trainable params: 12,353,408
_________________________________________________________________
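The summary above (frozen `densenet169` backbone → GlobalAveragePooling2D → Dropout → Dense) can be rebuilt with a short head-construction sketch. The 0.5 dropout rate is an assumption (dropout has no parameters, so the summary does not reveal it), and `weights=None` again avoids the pretrained-weight download:

```python
from tensorflow.keras import layers, models
from tensorflow.keras.applications import DenseNet169

base = DenseNet169(weights=None, include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # frozen backbone; the last block is unfrozen for fine-tuning

# Head matching the summary: GAP -> Dropout -> Dense(1, sigmoid) for the
# binary Normal-vs-Pneumonia decision. Dense over 1664 features = 1665 params.
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.5),  # rate is an assumed value
    layers.Dense(1, activation='sigmoid'),
], name="DenseNet169")

model.summary()
```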
Train the model with callbacks
Epoch 1/10
 9/10 [==========================>...] - ETA: 35s - loss: 0.7858 - accuracy: 0.5308 - average_precision: 0.5874 - f1_score: 0.6269 
Epoch 00001: val_loss improved from inf to 0.81679, saving model to model_weights/best_densenet.h5
10/10 [==============================] - 376s 38s/step - loss: 0.7862 - accuracy: 0.5274 - average_precision: 0.5698 - f1_score: 0.6108 - val_loss: 0.8168 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 2/10
 9/10 [==========================>...] - ETA: 33s - loss: 0.6691 - accuracy: 0.5885 - average_precision: 0.5739 - f1_score: 0.6246 
Epoch 00002: val_loss did not improve from 0.81679
10/10 [==============================] - 360s 36s/step - loss: 0.6761 - accuracy: 0.5822 - average_precision: 0.5765 - f1_score: 0.6237 - val_loss: 0.8189 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 3/10
 9/10 [==========================>...] - ETA: 34s - loss: 0.6822 - accuracy: 0.6192 - average_precision: 0.6150 - f1_score: 0.6517 
Epoch 00003: val_loss did not improve from 0.81679
10/10 [==============================] - 368s 37s/step - loss: 0.6889 - accuracy: 0.6130 - average_precision: 0.6107 - f1_score: 0.6497 - val_loss: 0.8219 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 4/10
 9/10 [==========================>...] - ETA: 33s - loss: 0.6415 - accuracy: 0.6192 - average_precision: 0.6586 - f1_score: 0.6783 
Epoch 00004: val_loss did not improve from 0.81679
10/10 [==============================] - 357s 36s/step - loss: 0.6418 - accuracy: 0.6199 - average_precision: 0.6594 - f1_score: 0.6730 - val_loss: 0.8216 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 5/10
 9/10 [==========================>...] - ETA: 32s - loss: 0.6167 - accuracy: 0.6577 - average_precision: 0.6893 - f1_score: 0.7003 
Epoch 00005: val_loss did not improve from 0.81679
10/10 [==============================] - 350s 35s/step - loss: 0.6136 - accuracy: 0.6644 - average_precision: 0.6989 - f1_score: 0.7013 - val_loss: 0.8244 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 6/10
 9/10 [==========================>...] - ETA: 34s - loss: 0.6136 - accuracy: 0.6885 - average_precision: 0.6617 - f1_score: 0.6854 
Epoch 00006: val_loss did not improve from 0.81679

Epoch 00006: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-06.
10/10 [==============================] - 367s 37s/step - loss: 0.6096 - accuracy: 0.6918 - average_precision: 0.6661 - f1_score: 0.6896 - val_loss: 0.8398 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 7/10
 9/10 [==========================>...] - ETA: 33s - loss: 0.5880 - accuracy: 0.6577 - average_precision: 0.6901 - f1_score: 0.6751 
Epoch 00007: val_loss did not improve from 0.81679
10/10 [==============================] - 362s 36s/step - loss: 0.5870 - accuracy: 0.6507 - average_precision: 0.6878 - f1_score: 0.6682 - val_loss: 0.8421 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 8/10
 9/10 [==========================>...] - ETA: 33s - loss: 0.6700 - accuracy: 0.6692 - average_precision: 0.6489 - f1_score: 0.6552 
Epoch 00008: val_loss did not improve from 0.81679
10/10 [==============================] - 358s 36s/step - loss: 0.6542 - accuracy: 0.6781 - average_precision: 0.6562 - f1_score: 0.6661 - val_loss: 0.8450 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 9/10
 9/10 [==========================>...] - ETA: 35s - loss: 0.6383 - accuracy: 0.6731 - average_precision: 0.6792 - f1_score: 0.6571 
Epoch 00009: val_loss did not improve from 0.81679
10/10 [==============================] - 373s 37s/step - loss: 0.6381 - accuracy: 0.6747 - average_precision: 0.6745 - f1_score: 0.6620 - val_loss: 0.8485 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
Epoch 10/10
 9/10 [==========================>...] - ETA: 35s - loss: 0.6173 - accuracy: 0.6577 - average_precision: 0.6913 - f1_score: 0.6853 
Epoch 00010: val_loss did not improve from 0.81679
10/10 [==============================] - 376s 38s/step - loss: 0.6249 - accuracy: 0.6541 - average_precision: 0.6847 - f1_score: 0.6792 - val_loss: 0.8507 - val_accuracy: 0.4444 - val_average_precision: 0.4688 - val_f1_score: 0.6377
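The "val_loss improved ... saving model" and "ReduceLROnPlateau reducing learning rate" messages in the log above come from two standard Keras callbacks. A sketch of their setup follows; `factor` and `patience` are assumptions inferred from the log, where the learning rate dropped to roughly 1e-5 at epoch 6 after five epochs without improvement:

```python
from tensorflow.keras.callbacks import ModelCheckpoint, ReduceLROnPlateau

# Writes the weights whenever val_loss improves, producing the
# "Epoch 00001: val_loss improved ... saving model" lines in the log.
checkpoint = ModelCheckpoint(
    'model_weights/best_densenet.h5',
    monitor='val_loss',
    save_best_only=True,
    verbose=1,
)

# Fired at epoch 6 above, cutting the learning rate to ~1e-5.
reduce_lr = ReduceLROnPlateau(
    monitor='val_loss',
    factor=0.1,   # assumed; consistent with a 1e-4 -> 1e-5 drop
    patience=5,   # assumed; matches five non-improving epochs in the log
    verbose=1,
)

# Both are then passed to fit: model.fit(..., callbacks=[checkpoint, reduce_lr])
```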
Save the final weights
--------------------------------------------------------------------------------
In [162]:
valid_pred, y_valid, x_valid, valid_pred_roc = evaluateValidationData(model)
Evaluate the model on validation data
--------------------------------------------------------------------------------
Loss: 0.851, Accuracy: 0.444,  AP: 0.469, F1 Score: 0.638
Predict on the validation data
--------------------------------------------------------------------------------
2/2 [==============================] - 15s 7s/step
In [163]:
test_pred, y_test, x_test, correct, incorrect, test_pred_roc = evaluateTestData(y_valid)
Predict on the test data
--------------------------------------------------------------------------------
2/2 [==============================] - 17s 9s/step
0    20
1    16
dtype: int64
0    20
1    16
dtype: int64
Correctly predicted 16 out of 36 test images
Predicted 44% of the test images correctly
In [164]:
viewPredictedImage(correct)
In [165]:
viewPredictedImage(incorrect)
In [166]:
evaluateROC(valid_pred_roc,test_pred_roc,y_valid,y_test)
ROC Curve for the validation data
--------------------------------------------------------------------------------
AUC: 0.600
ROC Curve for the test data
--------------------------------------------------------------------------------
AUC: 0.406
Classification Report on the test data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.00      0.00      0.00        20
   Pneumonia       0.44      1.00      0.62        16

    accuracy                           0.44        36
   macro avg       0.22      0.50      0.31        36
weighted avg       0.20      0.44      0.27        36

Classification Report on the validation data
------------------------------------------------------------------------------------------------------------------------
              precision    recall  f1-score   support

      Normal       0.00      0.00      0.00        20
   Pneumonia       0.44      1.00      0.62        16

    accuracy                           0.44        36
   macro avg       0.22      0.50      0.31        36
weighted avg       0.20      0.44      0.27        36

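In the reports above, Pneumonia has recall 1.00 and precision 0.44 (16/36) while Normal scores 0.00 across the board; that is exactly the report produced by a classifier that predicts "Pneumonia" for every image. A small sklearn sketch with stand-in labels (20 Normal, 16 Pneumonia, matching the support column) reproduces those numbers:

```python
import numpy as np
from sklearn.metrics import classification_report

# Stand-in labels matching the support column above: 20 Normal, 16 Pneumonia.
y_true = np.array([0] * 20 + [1] * 16)
# A degenerate classifier that calls every image Pneumonia.
y_pred = np.ones_like(y_true)

# Yields precision 0.44 / recall 1.00 for Pneumonia and 0.00 for Normal,
# with overall accuracy 0.44 -- the same figures as the reports above.
print(classification_report(y_true, y_pred,
                            target_names=['Normal', 'Pneumonia'],
                            zero_division=0))
```

This suggests the fine-tuned network has collapsed onto the positive class rather than learned a discriminative signal, which is consistent with the flat 0.4444 validation accuracy across all ten epochs.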
In [ ]: